Continuing with the theme of my last DW comment – ‘everything is new, yet everything remains the same’ (to paraphrase the rather more elegant French original) – this issue of DW includes a major article on Immersed Computing. The idea of liquid cooling has been around for quite some time, but has never really gained major traction. I suspect that this is almost entirely down to the fact that the combination of liquids, electricity and IT hardware scares most of us. Simplistically, I suspect we have all dropped, or know of someone who has dropped, a mobile phone/tablet/iPad etc. into the bath/down the toilet etc. and we know that the drenched item never quite works the same again, if at all!
So, no matter how elegant, sensible and trustworthy liquid cooling might sound, there’s always that fear that, if something does go wrong, it’s not going to sound too clever if the boss discovers that the data centre failed because of some kind of liquid leak or flood. No matter that the cooling technologies that are used in the data centre owe more to the ancient Greeks than to any major 21st century technology breakthrough.
However, as the quest for digital transformation gains momentum, it’s becoming increasingly obvious that the folks that are embracing this objective most successfully are those that are prepared to challenge the received wisdom of generations of IT and data centre specialists. It might not be necessary to start with a completely blank piece of paper, but it’s certainly a good idea to ensure that your sheet has a margin where you can park all the tried and trusted ideas, and bring them in where it does not make sense to embrace some of the more recent ideas. And I say ideas, because most, if not all, of the technologies these ideas are based upon have been around for quite a while. What’s changed is that, where once these technologies were slow and inefficient, many of them are now reaching a level of maturity and reliability that makes them viable for use in the world of IT.
Liquid cooling (along with Cloud, virtualisation, the edge, SSDs, IoT…) might not be for everyone, but you’d be foolish to ignore the claims of the technology without examining its potential. If I’ve understood it correctly, not only does it keep your IT hardware nice and comfortable, there’s also the possibility for using the waste heat from the process. Scandinavia and other parts of mainland Europe in particular seem to like the idea of re-purposing waste energy and, while it doesn’t always make financial sense, it’s just another ‘new’ idea to consider.
Of course, all of the new ideas and technologies currently being promoted across the IT universe only make true sense if all parts of the business work together. So, if you want to become a truly digital business, it’s time to knock down those silos, tear down the divides between departments and make sure that everyone understands what’s available and what it can do for the overall business. Yep, you need to establish a working group that includes data centres, IT, sales, marketing, finance, compliance, HR and more. Not easy, but pretty much essential to ensure that the impact of any idea under consideration is fully understood by all.
New research reveals 80 percent of companies at risk of being left behind as most digital innovation projects fail to meet expectations.
Despite spending millions of dollars on digital transformation in the past year, enterprises still feel they are at significant risk of being left behind by their industries, research from Couchbase shows. In the survey of 450 heads of digital transformation for enterprises across the U.S., U.K., France, and Germany, 80 percent are at risk of being left behind by digital transformation while 54 percent believe organizations that don’t keep up with digital transformation will go out of business or be absorbed by a competitor within four years. And IT leaders are also at risk, with 73 percent believing they could be fired as the result of a poorly implemented or failing digital project.
“Our study puts a spotlight on the harsh reality that despite allocating millions of dollars towards digital transformation projects, most companies are only seeing marginal returns and realizing this trajectory won’t enable them to compete effectively in the future,” said Matt Cain, CEO of Couchbase. “With 87 percent of IT leaders concerned that their revenue will drop if they don’t significantly improve their customers’ experiences, it’s critical that they focus on projects designed to increase customer engagement. Key to succeeding here is selecting the right underlying database technology that can leverage dynamic data to its full potential across any platform and deliver the personal, highly responsive experiences that customers are demanding today.”
Ninety percent of IT leaders said their plans to use data for new digital services were limited by factors such as the complexity of using multiple technologies or a lack of resources, as well as reliance on legacy database technology.
Survey respondents also identified specific issues with legacy databases that could lead to digital projects underperforming.
"Historically, some enterprises haven’t done well at using data to improve customer experience, which is why digitally native companies have made some giant inroads in traditionally brick & mortar businesses,” said John A. De Goes, CTO of SlamData Inc. “If all enterprises want to thrive, they need the confidence, ability, and technology to reinvigorate the customer experience. They need a revolution in the way they use data, to transform the customer experience and provide a data-driven way of truly engaging with end-users."
Findings from CITO Research and Commvault reveal that while business leaders are rapidly embracing the cloud, 81 percent are very concerned about missing out on new cloud advancements.
Commvault has published the results of a new executive survey that found that 81 percent of C-level and other IT leaders are either extremely concerned or very concerned about missing out on cloud advancements. The survey, which was conducted in partnership with IT research firm CITO Research, demonstrates that CEOs, CIOs and CTOs are experiencing serious cloud Fear of Missing Out (FOMO).

Alert Logic has published the results of a comprehensive research study, “Cybersecurity Trends 2017 Spotlight Report,” that explores the latest cybersecurity trends and organisational investment priorities among companies in the UK, Benelux and Nordics.
Conducted amongst 317 security professionals, the survey indicates that while cloud adoption is on the rise, the top concern is how to secure data in the cloud and protect against data loss (48 per cent). The next two biggest priorities for security professionals were threats to data privacy (43 per cent) and regulatory compliance (39 per cent).
The study also examined the top constraints faced by these organisations in securing cloud computing infrastructures. It found that organisations lack internal security resources and expertise to cope with the growing demands of protecting data, systems and applications against increasingly sophisticated threats (42 per cent). This is closely followed by a desire to reduce the cost of security (33 per cent), moving to continuous 24x7 security coverage (29 per cent), improving compliance (24 per cent) and increasing the speed of response to incidents (20 per cent).

Public cloud platform providers like Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform offer many security measures, but organisations are ultimately responsible for securing their own data and the applications running on those cloud platforms.
According to Verizon’s recent security report, attacks on web applications are now the no. 1 source of enterprise data breaches, up 300 per cent since 2014. Similarly, the report found cybersecurity professionals – more than half of survey participants – to be most concerned about customer-facing web applications introducing security risk to their business (53 per cent). This is followed by mobile applications (48 per cent), desktop applications (33 per cent) and business applications such as ERP platforms (31 per cent). Application-related breaches have negative consequences and can lead to revenue loss, significant recovery expense and a damaged reputation.
“Web applications are the most significant source of breaches for organisations leveraging cloud and cloud hybrid computing infrastructures,” said Oliver Pinson-Roxburgh, EMEA Director at Alert Logic. “They are complex, with a large attack surface that can be compromised at any layer of the application stack and often utilise open source and third-party development tools that can introduce vulnerabilities into an enterprise.”
Organisations can implement measures to prevent gaps in the security policy of an application, or to avoid vulnerabilities in the underlying system caused by flaws in the design, development, deployment, upgrade, maintenance or database of the application. Additionally, many businesses turn to cloud security vendors with a “products + services” strategy rather than technologies alone to fight web application attacks. Businesses increasingly find that a combination of cloud-native security tools, provided in combination with 24x7 security monitoring by security and compliance experts, is the best way to secure their sensitive data – and the sensitive data of their customers – in the cloud.

“A multi-layer web application attack defence is the cornerstone of any effective cloud security solution and strategy,” said Pinson-Roxburgh.
A new update of the Worldwide Semiannual Small and Medium Business Spending Guide from International Data Corporation (IDC) forecasts that total IT spending by small and medium-size businesses (SMBs) will approach $568 billion in 2017 and increase by more than $100 billion to exceed $676 billion in 2021. With a five-year compound annual growth rate (CAGR) of 4.5%, spending by businesses with fewer than 1,000 employees on IT hardware, software, and services, including business services, is expected to be slightly stronger than IDC's previous forecast.
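For readers who want to sanity-check forecast figures like these, the compound annual growth rate is simply the constant yearly growth that links a start value to an end value. A quick illustrative check in Python, using the dollar figures above (the function name is ours):

```python
def cagr(start, end, years):
    """Compound annual growth rate linking a start value to an end value."""
    return (end / start) ** (1 / years) - 1

# IDC's figures: ~$568 billion in 2017 rising to ~$676 billion in 2021.
print(f"{cagr(568, 676, 4):.1%}")  # ~4.4%, consistent with the 4.5%
# five-year CAGR IDC quotes from its 2016 base year.
```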
"SMB IT spending growth continues to track about two percentage points higher than GDP growth across regions. But beneath that slowly rising tide are faster moving currents that reflect the changing ways SMBs are acquiring and deploying technology," said Ray Boggs, vice president, SMB Research at IDC.
SMBs around the world are increasingly interested in investing in resources to improve employee productivity and strengthen their competitive positions. Boggs noted that while SMBs, especially smaller ones, have immediate tactical needs to sharpen performance, they are also looking to coordinate resources in a meaningful way. For many, this will be an important step towards Digital Transformation (DX).
SMBs will spread their IT investments about equally across the three major categories – hardware, software, and IT services – with these categories accounting for more than 85% of total SMB technology spending worldwide. While hardware purchases currently represent the largest share of this spending, IDC expects 2019 to be the watershed year when software and IT services spending both surpass hardware spending. The smallest of the major categories – business services – will see the greatest spending growth of the four technology categories at 7.1% CAGR, followed closely by software (6.9% CAGR).
One third of all SMB software purchases in 2017 will be from the top three application categories: enterprise resource management (ERM), customer relationship management (CRM), and content applications. Application development & deployment and system infrastructure software will also be key areas for SMB software investment. Hardware spending will be led by purchases of PCs and peripherals, which accounted for almost half of SMB hardware spending in 2016 (49.6%), a share that will decline to 43.3% over the forecast period. SMB services spending is divided between IT services and business services. While SMB spending on IT services will continue to be more than twice that of business services throughout the forecast period, business services' share is growing, with spending growth roughly twice that of IT services (7.1% vs. 3.7% CAGR).
Medium-sized businesses (100-499 employees) will be the largest market throughout the 2016-2021 forecast, with 38% of worldwide SMB IT products and services revenues coming from this group of companies. The remaining revenues will be generated about equally by large businesses (500-999 employees) and small businesses/small offices (1-99 employees). Medium and large firms will also experience the strongest spending growth, with CAGRs of 4.6% and 4.5% respectively, slightly above small business spending growth of 4.4%. The SMB opportunity for both near-term and long-term IT spending growth extends across all company sizes and technology categories.
"The Western European SMB market is big and growing even if European SMBs traditionally show a lower level of IT sophistication than their bigger counterparts and therefore they can represent a difficult target market. In this context we see today the rise of SMBs that were born in the digital era, that are very innovative and attracted by 3rd Platform and Innovation Accelerators (particularly cloud, mobility, and IoT). Even if these companies represent only a small percentage of the overall SMB market, they can set the scene and pave the way to a broader adoption of innovative IT solutions," said Angela Vacca , senior research Manager, Customer Insights & Analysis
.
The market for governance, risk and compliance (GRC) software is expected to experience strong growth as business leaders look for solutions to meet the challenges of regulatory change, cybersecurity threats, third-party exposure, and reputation risk. In the first forecast to size the overall GRC software market, International Data Corporation (IDC) sees worldwide revenues reaching $11.8 billion in 2021, growing at a compound annual rate of 6.7% over the 2016-2021 forecast period.
A number of factors are driving the growth in demand for GRC applications. Regulatory compliance has become increasingly complex and corporate governance, risk and compliance initiatives have come under greater scrutiny. Given the financial and reputational impact of high visibility compliance and security breaches, risk management has become a strategic level conversation, discussed among the C-suite and corporate board. At the same time, risk management responsibilities have started to shift downward toward the line of business owner as the first line of defense, making user engagement, ease of use, and integration with other enterprise applications just as important as reporting.
"Successful GRC vendors are developing more intuitive and configurable platforms, providing expanded integration and content options, and focusing on user engagement through automated reporting, alerting, and mobile accessibility," said Angela Gelnaw, senior research analyst, Legal, Risk & Compliance Solutions.
Another important factor driving growth in the GRC market is the rise of cloud solutions, which are growing faster than the overall market. Adoption of GRC applications among small and medium-sized businesses, line of business managers, and less-regulated industries has been central to the growth of these cloud-based solutions.
IDC defines governance, risk and compliance software as the aggregation of the tools required to help an enterprise identify, track, and analyze enterprise and technology risks and to monitor and manage corporate and IT governance and compliance initiatives to enhance performance and stay in compliance with global laws and regulations, industry standards, and company policies. IDC recognizes five segments in the GRC software market: GRC Integrated Suites, Corporate Governance & Compliance Management applications, Enterprise Risk Management applications, Audit Management solutions, and Business Resiliency applications. The GRC Integrated Suites segment comprises about 20% of the overall market and is expected to experience healthy growth throughout the forecast. The rest of the GRC market is fragmented across the other four segments.
Worldwide revenues for the augmented reality and virtual reality (AR/VR) market are forecast to increase by 100% or more over each of the next four years, according to the latest update to the Worldwide Semiannual Augmented and Virtual Reality Spending Guide from the International Data Corporation (IDC). Total spending on AR/VR products and services is expected to soar from $11.4 billion in 2017 to nearly $215 billion in 2021, achieving a compound annual growth rate (CAGR) of 113.2% along the way.
The United States will be the region with the largest AR/VR spending total in 2017 ($3.2 billion), followed by Asia/Pacific (excluding Japan) (APeJ) ($3.0 billion) and Western Europe ($2.0 billion). But then things get interesting, as APeJ jumps ahead of the U.S. in total spending for two years before its growth rate starts to slow in 2019. The U.S. then pushes back into the top position in 2020, driven by accelerating growth in the latter years of the forecast. Meanwhile, Western Europe is expected to overtake APeJ for the number 2 position in 2021. The regions that will experience the fastest growth over the 2016-2021 forecast period are Canada (145.2% CAGR), Central and Eastern Europe (133.5% CAGR), Western Europe (121.2% CAGR) and the United States (120.5% CAGR).
Within the regions, the industry segments driving AR/VR spending start from roughly the same place, but then evolve quite differently over time. The consumer segment will be the largest source of AR/VR revenues in each region in 2017. In the United States and Western Europe, the next largest segments are discrete manufacturing and process manufacturing. In contrast, the next largest segments in APeJ in 2017 are retail and education. Over the course of the forecast, the consumer segment in the U.S. will be overtaken by process manufacturing, government, discrete manufacturing, retail, construction, transportation, and professional services. In APeJ, the consumer segment will remain the largest area of spending throughout the forecast, followed by education, retail, transportation, and healthcare in 2021. Consumer spending will also lead the way in Western Europe, with discrete manufacturing, retail, and process manufacturing showing strong growth by the end of the forecast.
"The consumer, retail, and manufacturing segments will be the early leaders in AR & VR investment and adoption. However, as we see in the regions, other segments like government, transportation, and education will utilize the transformative capabilities of these technologies," said Marcus Torchia, research director of IDC Customer Insights & Analysis. "With use cases that span both AR & VR environments, we see a significant opportunity for companies to re-cast how users interact in business processes and everyday tasks."
"Augmented and virtual reality are gaining traction in commercial settings and we expect this trend will continue to accelerate," said Tom Mainelli, program vice president, Devices and AR/VR at IDC. "As next-generation hardware begins to appear, industry verticals will be among the first to embrace it. They will be utilizing cutting-edge software and services to do everything from increase worker productivity and safety to entice customers with customized, jaw-dropping experiences."
The industry use cases that will attract the largest AR/VR investments are also expected to evolve over the five-year forecast. In 2017, the largest industry use cases will be retail showcasing ($442 million), on-site assembly and safety ($362 million), and process manufacturing training ($309 million). By the end of the forecast, the largest industry use cases in terms of spending will be industrial maintenance ($5.2 billion) and public infrastructure maintenance ($3.6 billion), followed by retail showcasing ($3.2 billion). In contrast, the consumer segment will be dominated by AR and VR games throughout the forecast, with total spending reaching $9.5 billion in 2021. The use cases that will see the fastest growth over the forecast period are lab & field (166.2% CAGR), therapy and physical rehabilitation (152.0% CAGR), and public infrastructure maintenance (138.4% CAGR).
Spending on VR systems, including viewers, software, consulting services, and systems integration services, is forecast to be greater than AR-related spending in 2017 and 2018, largely due to consumer uptake of hardware, games, and paid content. After 2018, AR spending will surge ahead as industries make significant purchases of AR software and viewers.
Achieving broad competence in event-driven IT will be a top three priority for the majority of global enterprise CIOs by 2020, according to Gartner, Inc. Defining an event-centric digital business strategy will be key to delivering on the growth agenda that many CEOs see as their highest business priority.
"Event-driven architecture (EDA) is a key technology approach to delivering this goal," said Anne Thomas, vice president and distinguished analyst at Gartner. "Digital business demands a rapid response to events. Organizations must be able to respond to and take advantage of 'business moments' and these real-time requirements are driving CIOs to make their application software more event-driven."
Because CEOs are focused on growth via digital business, CIOs should focus on defining an event-centric digital business strategy and articulate the business value of EDA. According to the Gartner 2017 CEO survey, 58 percent of CEOs see growth as their highest business priority. CEOs achieve growth by adopting new business models, introducing new products and services, expanding into new markets and geographies, upselling to existing customers and stealing market share from competitors.
"Findings from the survey clearly indicate that CEOs view digital business as their No. 1 opportunity for growth," said Ms. Thomas. "Most CEOs also recognize a triangular relationship between technology, product improvement and growth. They recognize that technology is the fundamental enabler of digital transformation and leading digital companies have figured out that EDA is the 'secret sauce' that gives them a competitive edge."
Event-centric processing is the native architecture for digital business, and to enable growth through digital business, strategic parts of the application portfolio will need to become event-driven. CIOs can use EDA to foster growth by enabling digital business transformation, capitalizing on digital business moments, using modern technologies, accelerating business agility and enabling application modernization.
"Event processing and analytics play a significant role in allowing organizations to capitalize on a business moment," said Ms. Thomas. "A convergence of events generates a business opportunity, and real-time analytics of those events, as well as current data and wider context data, can be used to influence a decision and generate a successful business outcome. But you can't capitalize on the business moment if you don't first recognize the convergence of events and the digital business opportunity."
This is why digital business is so dependent on EDA. The events generated by systems — customers, things and artificial intelligence (AI) — must be digitized so that they can be recognized and processed in real time. EDA will become an essential skill in supporting the transformation by 2018, meaning that application architecture and development teams must develop EDA competency now to prepare for next year's needs. CIOs should identify current projects where EDA can provide the most value to enable adoption of technology innovations such as microservices, the Internet of Things (IoT), AI, machine learning, blockchain and smart contracts.
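To make the pattern concrete, here is a minimal illustrative sketch of the publish/subscribe mechanism at the heart of EDA, written in Python; all names are hypothetical, and this is a sketch of the general pattern rather than a Gartner reference design:

```python
from collections import defaultdict

class EventBus:
    """Tiny in-process publish/subscribe bus illustrating the EDA pattern."""
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self._handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        # Every interested consumer reacts; the producer knows none of them.
        for handler in self._handlers[event_type]:
            handler(payload)

bus = EventBus()
# Two independent consumers react to the same 'business moment'.
bus.subscribe("order.placed", lambda e: print(f"Billing: invoice {e['id']}"))
bus.subscribe("order.placed", lambda e: print(f"Analytics: record {e['id']}"))
bus.publish("order.placed", {"id": 42})
```

The value lies in the decoupling: new consumers – fraud detection, IoT triggers, machine-learning scoring – can subscribe to the same business moment without the producer changing at all.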
A legacy application portfolio can be a significant inhibitor to digital business transformation. A digital business technology foundation must support continuous availability, massive scalability, automatic recovery and dynamic extensibility. Digital business applications must also use modern technologies to engage customers, support digital business ecosystems, capitalize on digital moments, and exploit AI and the IoT.
"Modernizing core application systems takes time, and few organizations are in a position to immediately move over to a replacement system," said Ms. Thomas. "Instead, they need to use EDA to stage their modernization efforts, and gradually migrate capabilities while implementing digital transformation."
Market hype and growing interest in artificial intelligence (AI) are pushing established software vendors to introduce AI into their product strategy, creating considerable confusion in the process, according to Gartner, Inc. Analysts predict that by 2020, AI technologies will be virtually pervasive in almost every new software product and service.
In January 2016, the term "artificial intelligence" was not in the top 100 search terms on gartner.com. By May 2017, the term ranked at No. 7, indicating the popularity of the topic and interest from Gartner clients in understanding how AI can and should be used as part of their digital business strategy. Gartner predicts that by 2020, AI will be a top five investment priority for more than 30 percent of CIOs.
"As AI accelerates up the Hype Cycle, many software providers are looking to stake their claim in the biggest gold rush in recent years," said Jim Hare, research vice president at Gartner. "AI offers exciting possibilities, but unfortunately, most vendors are focused on the goal of simply building and marketing an AI-based product rather than first identifying needs, potential uses and the business value to customers."
AI refers to systems that change behaviors without being explicitly programmed, based on data collected, usage analysis and other observations. While there is a widely held fear that AI will replace humans, the reality is that today's AI and machine learning technologies can and do greatly augment human capabilities. Machines can actually do some things better and faster than humans, once trained; the combination of machines and humans can accomplish more together than separately.
To successfully exploit the AI opportunity, technology providers need to understand how to respond to three key issues:
1) Lack of differentiation is creating confusion and delaying purchase decisions
The huge increase in startups and established vendors all claiming to offer AI products without any real differentiation is confusing buyers. More than 1,000 vendors with applications and platforms describe themselves as AI vendors, or say they employ AI in their products.
Similar to greenwashing, in which companies exaggerate the environmental-friendliness of their products or practices for business benefit, many technology vendors are now "AI washing" by applying the AI label a little too indiscriminately, according to Gartner. This widespread use of "AI washing" is already having real consequences for investment in the technology.
To build trust with end-user organisations, vendors should focus on building a collection of case studies with quantifiable results achieved using AI.
"Use the term 'AI' wisely in your sales and marketing materials," Mr. Hare said. "Be clear what differentiates your AI offering and what problem it solves."
2) Proven, less complex machine-learning capabilities can address many end-user needs
Advancements in AI, such as deep learning, are getting a lot of buzz but are obfuscating the value of more straightforward, proven approaches. Gartner recommends that vendors use the simplest approach that can do the job over cutting-edge AI techniques.
3) Organisations lack the skills to evaluate, build and deploy AI solutions
More than half the respondents to Gartner's 2017 AI development strategies survey indicated that the lack of necessary staff skills was the top challenge to adopting AI in their organisation.
The survey found organisations are currently seeking AI solutions that can improve decision making and process automation. If they had a choice, most organisations would prefer to buy embedded or packaged AI solutions rather than trying to build a custom solution.
"Software vendors need to focus on offering solutions to business problems rather than just cutting-edge technology," said Mr. Hare. "Highlight how your AI solution helps address the skills shortage and how it can deliver value faster than trying to build a custom AI solution in-house."
Although some CEOs might recognise that companies such as Uber or Airbnb are disrupting the business world, many still maintain a wait-and-see attitude. This usually means that they only respond once the threat to their business has been identified. But in the case of digital disruption, this approach simply will not do. There will not be enough time for business owners to respond in a manner that minimises impacts to their company.
By Janelle Hill, VP Distinguished Analyst, Gartner.

The main problem associated with digital disruption is that it often exists outside of the organisation’s normal range of vision. Although CIOs and their business executives acknowledge the potential for digital disruption, they lack the necessary tools, techniques and criteria for identifying and assessing these disruptions.
Digital disruptions are more difficult to adapt to than earlier technology-triggered shifts as a result of their virtual nature. In the past, disruptions were typically caused by physical technologies such as PCs or ATMs. Digital disruptions, on the other hand, mostly exist in the virtual world. This makes them hard to detect until after the impact has been felt.
Fortunately, CIOs can lead an organisation to overcome the challenges of digital disruption and equip peers to recognise and deal with digital disruption.
There is a significant difference between real digital disruption and fads – which businesses must recognise. Examples of fads include Pokemon Go or Google Glass. They will incite lots of excitement but have limited impact. A real disruption will completely redefine the market’s needs and potentially cause a significant change in the industry. The introduction of the iPad, for instance, caused changes in application development, impacted revenue of desktop and notebook computer manufacturers, and even changed how humans interact with technology, with FaceTime as the first mobile conferencing application.
Enterprises looking to identify disruptors before it’s too late should set up a “sensing apparatus” to monitor external indicators. These indicators include shifting customer behaviour and consumer trends, as many disruptors originate in the consumer world.
Companies should also pay attention to where venture capitalists are investing and to disruptions in adjacent markets. The sensing apparatus will create a lot of information to handle, so look to data scientists who can be useful in uncovering insights.
Monitoring external industries is new territory for a CIO, but other members of the executive team will be better equipped for such an effort. Depending on the setup of the business, the CIO might look to partner with the CMO, CFO, VP of Supply Chain or the Head of R&D to gain a greater understanding of potential disruptors. In a business-to-business setup, disruption can happen in the supply chain or with the end customer, so it’s best to partner with both the CMO and the VP of Supply Chain. For business-to-consumer companies, disruptions are most likely to happen in the customer segment, so the focus should be on the CMO.
CMOs can offer insight into customer and market behaviour. They will also be able to identify potential indicators and will probably have the staff with the skills to analyse the data. In return, CIOs can offer CMOs institutional knowledge about IT systems and why certain systems are set up the way they are to provide perspective on how a potential disruption challenges the status quo.
Once a disruption is identified, the organisation must figure out its response.
Learn more about CIO leadership and how to drive digital innovation to the core of your business at Gartner Symposium/ITxpo 2017, taking place 5-9 November in Barcelona.
DCS talks to Rolf Brink, Founder and CEO of Asperitas, about immersed computing and the tremendous potential value to data centres.
1. Please can you provide us with some background on the company – when/why formed and progress to date?
Asperitas was actually founded on May 2, 2014 by myself and Markus Mandemaker with a completely different focus. We intended to tackle a big problem in the marine industry: system integration on board ocean vessels. We wanted to create a micro-cloud-based infrastructure, but had to think of a way to create an affordable, lean and mean micro datacentre on board a ship. Think of a rolling ship, salt air, no IT staff and a lack of conditioned rooms. This is how we came up with the idea of immersion. We actually designed a small, mobile, air-tight immersion system which would be cooled with seawater as part of the engine cooling circuit. However, just before we hit the button to start the manufacturing of the first prototype, we realised that all the other aspects of immersion, combined with our focus on integrating technologies, were a lot more interesting and important for a completely different industry which we were more familiar with: datacentres. This was in April 2015. We spent the following two years developing an immersion solution suitable for normal datacentre environments, allowing for more sustainability, flexibility and efficiency. We launched the result of this development in March this year, and ever since we have been in the spotlight for datacentre efficiency and innovation, and our pipeline is gradually filling up.
2. And who are the key personnel involved?
We started with just Markus and myself. We realised that to invent something which is compatible with datacentres, we had to get the right people involved. Since this can be quite challenging, we rely on partners for a lot of the knowledge and expertise we require. In order to allow a fast, high quality scale up with a limited risk, we built an ecosystem of development and delivery partners with the right knowledge and expertise.
This approach has the result that we can remain relatively small ourselves. Our key staff is focused on managing the consortium of partners, market development and delivery management. This is where Maikel Bouricius (Marketing), Leon Lips (Sales) and Els Knijff (Operations) are playing the most important roles.
3. Please can you summarise the Asperitas ‘immersed computing’ proposition?
Asperitas’ Immersed Computing is an integrated approach to IT immersion and the first of its kind that makes immersion viable for datacentres.
The key issue we had to solve when we started developing was the usability aspect. Immersion cooling already exists and is positioned in a niche part of high performance computing. It was actually already patented in the late 1960s by a small US-based company called “International Business Machines”. You may have heard of them…
There are, however, significant challenges when it comes to the practical use, which are sometimes of a lower priority in the HPC industry. This is why we were not so much focused on researching immersion itself; after all, it is already a proven technology. Instead, we were much more interested in addressing the practical implications of adopting immersion. We did extensive research into the reasons why immersion had not broken through in the cloud industry, where density challenges are constantly growing. Each reason became an instant design requirement. This has become the foundation of “Immersed Computing”, as opposed to cooling.
Immersed Computing addresses the entire way of work. It focuses on the integration of technologies, handling, tooling, efficiency and vision related to immersion. This results in the end-to-end proposition with our core immersion product at the centre. It includes the self-contained and self-sustained AIC24, the service trolley which is a semi-automatic servicing system, IT service tooling and several types of containment tools. Next to this, it also includes work principles related to liquid management, disaster management and knowledge around material compatibility, including a management environment which generates insight.
4. In more detail, can you talk us through the potential advantages of liquid cooling within the data centre, starting with the potential for significant heat recovery/reuse?
Applying immersion, which is also called ”Total Liquid Cooling”, results in an environment where 100% of the IT thermal energy is captured in liquid. Since liquid can contain much more heat and transport this much farther compared to air, it allows you to harvest the heat very easily. We apply an integrated water cooled heat exchanger inside our immersion system and also focus on insulation of the entire system. This means that nearly all heat is captured in the water circuit.
Water is an easy medium to apply and can carry this energy along large distances without significant energy loss. The fact that warm water can be used for cooling, allows datacentres to eliminate chiller systems completely. This is a significant cost saving.
Furthermore, bringing liquid to the rack or to the IT itself removes the need for raised floors, aisle separation schemes and other large-scale air handling features inside the datacentre.

Finally, the energy efficiency of the facility is greatly improved. Since energy is saved both on the cooling infrastructure and on the IT itself (by removing the fans), the supporting power infrastructure, such as UPS and no-break systems, can also be downsized.
Other advantages include noise reduction, since we eliminate moving parts from the IT, and increased density and reduced floor space.
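The heat-capture advantage rests on simple physics: the heat a coolant stream carries is its mass flow times its specific heat times the temperature rise (Q = m × c × ΔT), and water holds roughly 3,500 times more heat per cubic metre than air. A rough illustrative calculation, with assumed round-number values:

```python
# Rough comparison of heat transport in water vs. air (illustrative values).
CP_WATER = 4186      # J/(kg*K), specific heat of water
CP_AIR   = 1005      # J/(kg*K), specific heat of air
RHO_WATER = 1000     # kg/m^3, density of water
RHO_AIR   = 1.2      # kg/m^3, density of air

def heat_flow_kw(flow_m3_per_h, rho, cp, delta_t):
    """Q = m_dot * c_p * dT for a given volumetric flow and temperature rise."""
    m_dot = flow_m3_per_h / 3600 * rho          # kg/s
    return m_dot * cp * delta_t / 1000          # kW

# 1 m^3/h of water warmed by 10 K carries about as much heat
# as ~3,500 m^3/h of air warmed by the same 10 K.
print(f"water: {heat_flow_kw(1, RHO_WATER, CP_WATER, 10):.1f} kW")    # ~11.6 kW
print(f"air:   {heat_flow_kw(3500, RHO_AIR, CP_AIR, 10):.1f} kW")     # ~11.7 kW
```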
5. And there’s also the possibility of allowing for increased temperatures within the data centre?
The application of immersion also allows you to use high cooling temperatures. We have already tested several systems based on 55°C water input, which is hot water cooling, and you can effectively draw 60°C hot water directly from our immersion environment. This is of course not for all IT systems and platforms, but it still shows the potential of liquid.

The adoption of liquid in a broader sense allows datacentres to become effective heat producers, especially if they can line up different liquid technologies to create a high temperature difference. This process is called temperature chaining and allows other liquid technologies to contribute to the datacentre's thermal production. Our whitepaper about the Datacentre of the Future describes a hybrid environment which is compatible with all types of IT, but applies solely water infrastructures and a mix of liquid technologies with the purpose of creating reusable heat. The result is a near energy-neutral datacentre.
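Temperature chaining follows from the same Q = m × c × ΔT relation: each liquid-cooled stage dumps its heat into the one water circuit, stepping the temperature up. A hypothetical two-stage chain (all loads and flows are illustrative assumptions, not Asperitas specifications):

```python
CP_WATER = 4186                      # J/(kg*K), specific heat of water

def outlet_temp(t_in_c, heat_kw, flow_kg_s):
    """Temperature rise of a water circuit absorbing a given heat load."""
    return t_in_c + heat_kw * 1000 / (flow_kg_s * CP_WATER)

flow = 2.0                                   # kg/s through the whole chain
t1 = outlet_temp(30, 100, flow)              # stage 1: e.g. rear-door coolers, 100 kW
t2 = outlet_temp(t1, 150, flow)              # stage 2: e.g. immersion tanks, 150 kW
print(f"After stage 1: {t1:.1f} C, after stage 2: {t2:.1f} C")
# ~30 -> 41.9 -> 59.9 C: hot enough to be worth exporting for reuse.
```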
6. And liquid cooling has a smaller footprint than other technologies?
Yes, this is correct. Since liquid has a much higher heat capacity than air, IT can be positioned closer together. Also, a lot of space is traditionally used for airflow. Think about it, to allow air to flow through a rack, you need space in front and at the rear of the rack, solely for airflow. This is greatly reduced with liquid.
7. And the maintenance requirements of liquid cooling are less than for other cooling options?
Yes and no. IT should require less maintenance, but liquid cooled IT requires a little bit more time for maintenance. Therefore, on the IT side, there is not much change. The impact can be found on the facility side. Since the cooling infrastructure is drastically changed and simplified, the maintenance focus is also simplified. There are no issues with dust, air quality or moisture. Water circuits require a lot less space than air infrastructures and the elimination of chiller systems reduces the maintenance required.
Especially with a heat producing datacentre for reuse, most of the infrastructure disappears completely. Obviously you don’t need to maintain what you don’t have.
8. And then there’s the added flexibility that liquid cooling brings with it?
This is actually related to the maintenance and minimal requirements for the facility. This means that it becomes a lot easier to find and utilise locations for datacentre space. Ranging from micro-edge facilities to large core datacentres, the overhead installations are greatly reduced, as is the environmental impact. Think about it, immersion makes no sound, requires no chillers, allows for immediate reuse of heat and needs no air. These are perfect ingredients to quickly deploy any type of datacentre environment and solve someone else’s heat challenge at the same time.
9. You talk of the data centre of the future and refer to data centres as ‘information facilities’ – is this message being well-received or are you still being frustrated by the response of ‘we’ve always done it this way, why change?’?
Well, we’ve only recently started sharing our vision of the datacentre of the future and honestly, everybody knows it already. Information is the sole purpose of the datacentre, and every industry professional will agree. The problem is just that most people in the industry seem to forget this bigger picture in their day-to-day business lives and blindly follow existing ways of work.
We knew this would be the most difficult part of our go-to-market: to convince the industry that there is a different, more effective way of looking at efficiency, platforms and resiliency. This is why our strategy is driven by collaboration, sharing and openness. We want to share as much as we can about the ease of liquid and how you can deal with challenges in a different way, and to send out-of-the-box ideas into the world.
10. Indeed, do you think that the majority of data centre owners/operators are ready for the Asperitas message?
I’m not sure, but it looks like it. I certainly hope so, not only for Asperitas’ sake. The industry is causing a lot of energy spillage and the power grids in the world are stretched to their limits. The issues with global warming are real and can only be addressed with smart and integrated approaches to energy challenges. This is where datacentres really can show their potential to address global energy and sustainability challenges. The reality is that the business case for this must be sound before the industry will respond. The liquid approach to datacentre infrastructure is a compelling business case with short ROI timelines of less than 5 years, but a TCO approach to liquid is required to show this effect. So it really comes down to decision makers being open to innovation and change. It will probably take some time for our customers to demonstrate that they can be more competitive with a liquid based facility, before our technologies and ideas are adopted on a large scale.
11. There’s been much talk of green data centres, but not that much action. Do you think that this has to change in the future?
Yes, absolutely, although I don’t necessarily agree with the statement. We have seen an incredible move towards more “efficient” datacentres driven by PUE. PUE, however, is not an indication of efficiency. As soon as a PUE of 1.3 or lower is achieved, a datacentre could be considered to be “green”. Truth be told, there is too much to say about PUE that undermines the “green” message. Everyone in the industry is familiar with the many ways to manipulate the PUE figures: putting the internal electric grid in the IT balance, raising environmental temperatures which cause fans to work harder, not applying reuse scenarios because moving the energy negatively impacts PUE, and many, many more.
On top of this, datacentres are compared with each other which makes no sense whatsoever. A datacentre in Scandinavia with a PUE of 1.1 can easily be much less efficient than one with a PUE of 1.7 in the Mediterranean area.
PUE has done a terrific job for awareness and improving energy efficiency, but it needs to be abandoned as soon as the facility hits 1.3 in colder regions, or 1.7 in warmer areas. Simply because it gets in the way of today’s innovations. Therefore a new metric must be adopted to allow for real and smart energy efficiency beyond the datacentre itself.
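For reference, PUE is simply total facility energy divided by IT equipment energy, which is exactly why heat reuse can be penalised: the pumps that export heat count as facility overhead, while the exported energy earns no credit. A toy illustration, with assumed numbers:

```python
def pue(it_kw, overhead_kw):
    """PUE = total facility power / IT equipment power."""
    return (it_kw + overhead_kw) / it_kw

it = 1000                                # kW of IT load
print(pue(it, overhead_kw=100))          # 1.10: 'green' on paper
print(pue(it, overhead_kw=100 + 50))     # 1.15: a 'worse' PUE after adding
# 50 kW of pumps that export most of the heat for reuse -- a net win
# for overall energy effectiveness that the metric cannot see.
```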
12. And is a carbon-neutral data centre, or at least one that produces almost as much waste heat/energy for re-use as it consumes, a very real possibility?
Yes, absolutely! As long as we’re talking about a carbon neutral OPERATION. We can run the datacentre with a carbon neutral approach to start with. This is fully feasible with technologies available today. Nearly all electrical energy which is consumed by the datacentre can be re-captured as heat which can flow out of the facility in the shape of hot water, ready to be consumed by other industries.
However, you have to understand that most of us have no clue about what was involved in the manufacturing process of all technologies which we can find in a datacentre, let alone the carbon impact of everything. This is something which also needs to be addressed. At Asperitas we take this seriously, which is why we manufacture everything in The Netherlands where we can assure circular and environmentally friendly processes. We also have sufficient insight and focus on our supply chain to ensure sustainability for our complete product development and manufacturing.
13. Over and above what we’ve discussed so far, are there any other technologies and/or trends that you see as making an impact on the data centre of the future?
Yes, the global adoption of district heating and also low-grade heat networks. These are great developments that datacentres can build upon. The numerous smart city developments are also terrific examples which drive the Datacentre of the Future concept. This is all driven by a focus on creating synergy with completely unrelated industries which require heat: spas, hospitals and hotels, agriculture, the food industry, urban farms, households or offices, you name it.
14. And the need for agility?
Agility in particular is something which is optimally addressed with Immersed Computing. New environments can be positioned wherever they are required: new facilities, more capacity in existing facilities, the flexibility of water as the main cooling medium, the flexibility of combining different technologies in the hybrid model, you name it.
15. And, increasingly, the need for sustainability?
Yes. Sustainability is still too much in the background of doing business. I’m happy that the biggest challenge is already overcome by now though. Sustainability used to be a dirty word, associated with tree hugging. Today it is perceived as “sexy”, good for business and a common sense approach to any industry. This is a really big change in global business.
The next challenge is to make the shift from marketing to real focus and actual investments. Therefore we need to take a really good look at ourselves, our business models and our way of working. Basically, a more holistic approach.
This also means that we’re not only talking about energy, but also about circular economy, not buying or implementing what is not really required and being smart about reducing overhead. At Asperitas we’ve taken this to the next level as well. Everything we do is focused on circularity. Each part of our product can be refurbished or recycled. Even the used oil is not consumed, but will be returned to the manufacturer to get a new purpose.
16. If the greening of the data centre is desired and to be achieved, presumably it does need a radical re-think of the technologies being used, in terms of the data centre infrastructure?
This is where the combination of liquid technologies has to play a big role. There is not a single liquid technology which can address all end-to-end datacentre environments. Especially when the aim is to run carbon-neutral, multiple liquid technologies should be adopted in a hybrid model. This way we can apply temperature chaining to achieve really high temperatures in a liquid circuit.
17. We seem to be heading towards a hybrid data centre world – a mixture of web-scale/large, centralised facilities and more and more remote/edge, smaller facilities – how does Asperitas’s technology fit in both these scenarios?
Well, the Immersed Computing approach allows robust datacentre environments while at the same time allowing you to get into places where traditional datacentre builds can never go. The distributed datacentre model with core and edge facilities is addressed in such a way that sites can be qualified based on energy reuse, with drastically minimised “overhead” installations for power and cooling.
18. Moving on to Asperitas the company, you have a range of business partners who work with the company. Can you share some of the work that goes on in this development consortium?
We involved leading companies to aid us, and therefore we could progress very fast and at the right level of professionalism. The consortium consists of several different types of partners: mainly development, technology and experience partners. All partners in the consortium have contributed to the development with some type of investment – not just money, but also knowledge, man hours, technology, network or exposure.
The work that goes on is mostly related to creating smart solutions for dealing with Immersed Computing. Everything ranging from designing with a sketchbook, to CAD, to prototyping, to testing, to re-engineering and manufacturing is done with our business partners.
Other business partners are closely involved with us in order to develop fundamentally new approaches for the industry. An example of this is the collaborative work which we have done together with our partner Tebodin on our whitepaper about the Datacentre of the Future.
19. And Asperitas also has a wide range of sponsors, divided up into advisory, development and technology partners. Again, could you share with us some of the work that is carried out within these relationships?
The actual product development has been done in close collaboration with our high quality engineering partners like ADSE (Aircraft Development and Systems Engineering), Brink Industrial, a large steel manufacturer in The Netherlands, Perf-IT (management and monitoring), Aqualectra (electrical and control) and Total (Oil).
The technology partners have played a significant role in the development focus of specific parts of our solution, like Schleifenbauer (PDUs), SuperMicro, Bachmann, Starline and Mink.
Finally, there is a range of advisory partners who helped us to focus on the viability of the end-to-end approach. Partners like Leeds University, the Netherlands Enterprise Agency and GreenIT are great examples of this. Other advisory partners have been able to help us focus on the real usability priorities due to their extensive experience with existing immersion technologies. Partners like Vienna Scientific Cluster and ClusterVision have been very valuable to this focus.
20. In terms of your routes to market, what coverage does Asperitas have to date?
That’s hard to say. We’re focused on cloud providers and we’re in the spotlight when it comes to the infrastructures for cloud at the moment. We’ve gained a lot of traction with the media by sharing a lot of our vision and about the way we do things. Media however is not a measure of the willingness to adopt our technology. Looking at the response from potential customers however, we seem to be getting pretty far. We’re lining up with the biggest and the smallest of businesses and we’re having very good discussions with varying levels of decision makers.
Social media and digital media are large contributors to our market exposure today and I’d say that our coverage looks quite good. We have brought the AIC24 to various events including Datacentre Transformation, Cloud Expo Europe and the ISC and we make a point not to just bring the system, but also run it live to prove the point about simplicity, usability and flexibility. This shows how easy it is to adopt Immersed Computing. We have had great attention at the events so far and you will see us more often in the second half of 2017.
21. And what are your expansion plans?
Opportunistic really. We’ll see where the market takes us and we’ll expand accordingly. We originally planned to start in The Netherlands, but we’re already getting a lot of attention on a larger scale. This means that we’ll be looking at a UK presence, but we’re also considering alternative ways of addressing the market requirements. We’ve already got some ideas on delivery and support through partner channels which allows us to address a larger part of the industry.
22. And do you have any customer success stories and/or trials that you can share with us, please?
We have had an early prototype installation running since November 2016 at our development partner VSC, and they wish to continue using this system much longer than we ever planned for. They are already running a large immersion environment and they are very fond of our approach to immersion, which is a great boost.
Another example started as a demo site we opened at Schuberg Philis a bit more than a month ago, in June. It was meant to be a temporary demo site; now they don’t want us to take it away again. That demo system is now used to provide compute power for cancer research, and we have turned them into a customer.
Ever since our launch in March, we’ve been getting more and more interest. We’re preparing for more implementations and our pipeline is really building up at the moment. Hopefully, by this time next year we’ll have a good portfolio of customer successes.
25. Finally, do you think that the truly optimised data centre environment will only be brought about when the facilities and IT folks actually work as one big team, and not in their separate silos?
Yes. It is a common problem for any holistic approach to bringing efficiency to a business. Different silos operate within their own boundaries and rarely communicate. Each silo has its own responsibilities and is accountable for them. Breaking down walls is the basis of any disruption, and the organisations which can operate without these walls are the ones which can make a big difference and become more competitive than others, simply because they can take efficiency beyond the boundaries of individual disciplines.
26. Any other comments?
Asperitas is a very open organisation, driven by partnerships. We are also driven by sharing as much about liquid cooling and our technology as we can. This means that we’ll be sharing a lot more in the upcoming period, and there are many more visions which we will share and pursue. We will bring as much as we can into practice and welcome anyone who wishes to join us on this voyage, regardless of whether they are collaborators, suppliers, customers or competitors.
It is often difficult for people to see the full viability of our technologies and ideas, because ours is a fairly complicated message to get across. This comes from the fundamentally different way we look at datacentres. For example, we never talk about energy consumption or the cooling of IT. Instead, we talk about thermal production and IT health. This in turn illustrates how we look at IT equipment and the maintenance and facility aspects, and it is what brings us to these fundamentally different views on information versus physical footprint.
We have learned that experiencing the technology itself plays a critical role in understanding us, our messages and the fundamentally different foundation upon which a datacentre can be built.
Datacentres are big electrical heaters and we cool them. Why not make them more effective heaters instead?
Pervasive visibility and automation are critical to the rapid detection and mitigation of sophisticated security risks.
By Adrian Rowley, Technical Director EMEA for Gigamon.
It’s a sad fact, but many cybersecurity professionals have been forced to come to terms with the inevitability of security breaches, driven by two key factors: the sheer speed of data traversing networks, which leaves insufficient time for decision-making about potential threats; and the continuous growth both in the number of attackers and in the ecosystem of resources available to break through standard defences and propagate undetected across most network infrastructures.
The traditional security focus of instrumenting networks for prevention, and concentrating resources on a perimeter that can no longer be defined, is increasingly ineffective in today’s environment. Organisations are also hampered by limited visibility, extraordinary costs, growing infrastructure complexity, and reliance on manual processes to address security incidents.
At 100Gb network speeds, a new packet can arrive every 6.7 nanoseconds, outpacing an organisation’s ability to perform intelligent application security, threat detection or inspection. Security Operations teams and technology are therefore being overwhelmed in trying to manage and mitigate an increasing volume and variety of incidents. And this machine-to-human fight favours the attacker, leaving organisations severely disadvantaged.
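To put that figure in context, it follows from simple line-rate arithmetic: a minimum-size Ethernet frame of 64 bytes, plus its 8-byte preamble and 12-byte inter-frame gap, occupies 84 bytes (672 bits) on the wire, and at 100 Gb/s that is roughly 6.7 nanoseconds per packet – the worst-case window in which a per-packet security decision would have to be made. A back-of-the-envelope check, assuming a stream of minimum-size frames:

```python
# Back-of-the-envelope check of the 6.7 ns figure, assuming a stream of
# minimum-size Ethernet frames on a 100 Gb/s link.
FRAME_BYTES = 64       # minimum Ethernet frame
PREAMBLE_BYTES = 8     # preamble + start-of-frame delimiter
IFG_BYTES = 12         # inter-frame gap
LINE_RATE_BPS = 100e9  # 100 Gb/s

bits_per_slot = (FRAME_BYTES + PREAMBLE_BYTES + IFG_BYTES) * 8  # 672 bits
interval_ns = bits_per_slot / LINE_RATE_BPS * 1e9
print(f"{interval_ns:.2f} ns per packet")  # -> 6.72 ns per packet
```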
How best to address this critical situation? There is considerable industry recognition of the need for integrated and automated security architectures that help to mitigate these kinds of risks. According to Gartner1, for example, “Strategies for business continuity and disaster recovery will fundamentally change as enterprise and information are spread everywhere. Continuous visibility and understanding of systems, services, assets and partners is needed as digital business infrastructure will be in a state of constant flux.”
Meanwhile, Dan Cummins, senior analyst at 451 Research, has said that “Siloed security systems and data cannot accelerate or provide a basis for advanced prevention, detection and remediation activities, nor for process-driven security management. To address current threats and unseen risks ahead, organisations need to move towards a unified, collaborative and data-powered security framework that enables shorter cycle times for incident response and resolution while ensuring network performance and business continuity.”
The last few years have seen an exponential increase in the number of different security tools on the market, and there’s also been a lot of talk about machine learning, artificial intelligence (AI) and security orchestration. The problem is that it hasn’t been clear how these play together to improve a company’s security posture. If you deploy these technologies, are you more secure, less secure, or in the same situation as before? This has been difficult to assess because there hasn’t been a model against which organisations can measure security success or understand where any gaps remain.
Organisations are also at different phases of the security cycle. Many are in the first stage of doing the basics of providing firewalls, segmentation and multi-factor authentication. Some have moved beyond this and are beginning to build out a baseline, leveraging machine learning techniques, big data and open source and commercial tools. Only a few are in the automation phase as this is relatively new - although we expect to see more organisations starting to deploy aspects of automation and orchestration in 2018.
Another problem is that organisations need a model that addresses practical industry challenges including a massive shortage of skilled personnel, exponential growth in the volume of attacks, and manual, siloed processes. People are therefore asking questions such as how can we automate, how do we deal with the API explosion, and what is the role of machine learning and AI?
Those organisations that have started to build out their infrastructure have tended to do so in a relatively ad hoc manner, which may or may not take them to where they want to be. But one example of the ‘integrated and automated security architectures’ cited by Gartner is Gigamon’s new Defender Lifecycle Model, which is all about providing a structured approach that organisations can use to get to their desired outcome, quickly and efficiently.
Focused on a foundational layer of pervasive visibility and four key pillars - prevention, detection, prediction and containment - the model utilises a security delivery platform to deliver security services that can learn, detect, predict and contain threats throughout the attack lifecycle. This integrates machine learning, AI and security workflow automation to address the increasing speed, volume and polymorphic nature of network cyber threats, automate and accelerate threat identification and mitigation, and shift control and advantage away from the attacker and back to the defender.
The model also provides the intelligence, scale and flexibility to integrate with security tools such as firewalls and intrusion prevention systems to automate and accelerate threat containment and mitigation. With it, security professionals can map out the role of the various technologies involved in the threat ‘kill chain’, gain a better understanding of overall security readiness and gaps, and learn how to automate away human and process bottlenecks to stay ahead of threats – ultimately strengthening their organisation’s overall security posture and efficiency.
In moving to an automation model, you can begin to address two key challenges: the shortage of skilled personnel, and the need to respond quickly enough to contain attacks and prevent them from propagating.
Another big advantage is easy access to data. Organisations can get data from routers, firewalls, endpoints, domain controllers and so on, but the challenge is actually getting hold of it. Each of these entities is controlled by a different part of the IT organisation, and coordinating across these siloed departments is a challenge. Many of these approaches also add load to the devices, impacting their performance. So simply leveraging network traffic becomes a shortcut to gaining access to content-rich information.
In addition, machine learning addresses the big data challenge of security, which is gathering context from across an entire infrastructure and building a baseline, while AI applies algorithmic techniques on top of that to surface anomalies. Automation and orchestration then provide the ability to act on those anomalies.
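As a rough illustration of that division of labour – and only an illustration, with invented traffic figures standing in for the real network metadata a deployment would use – a baseline-and-threshold sketch might look like this:

```python
# Toy sketch of the pipeline described above: build a baseline of normal
# behaviour, flag statistical deviations, then hand anomalies to automation.
# The flow counts are invented for illustration.
import statistics

baseline = [1040, 990, 1010, 1005, 980, 1020]  # e.g. flows per minute
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

def is_anomalous(value, k=3.0):
    """Flag observations more than k standard deviations from the baseline."""
    return abs(value - mean) > k * stdev

for observation in (1015, 2400):
    if is_anomalous(observation):
        print(f"{observation}: anomaly -> hand off to orchestration/containment")
    else:
        print(f"{observation}: within baseline")
```

Real products use far richer models than a three-sigma rule, of course, but the division of responsibilities is the same.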
There are multiple aspects in which Gigamon plays into the machine learning, automation and containment, and initial basic hygiene phases. For example, as machine learning is all about big data – assimilating large volumes of data and building a baseline – Gigamon provides easy access to the content-rich data that allows companies to build that baseline. In terms of automation, the platform offers an alternative to dealing with the massive API explosion by providing a default API through which to orchestrate the various pieces of a solution. And if you want to deploy a basic good-hygiene technique like firewalls, Gigamon makes it easy to do so without having to deal with network maintenance windows or outages.
In this respect, Gigamon is not only an enabler of the machine learning, AI, automation and containment layers - it’s a foundation upon which enterprise network defences can be layered and, more importantly, efficiently leveraged.
1 Gartner, Inc., Use a CARTA Strategic Approach to Embrace Digital Business Opportunities in an Era of Advanced Threats, Neil MacDonald, Felix Gaehtgens, May 22, 2017.
Any business that wants to compete in the digital age has to take advantage of digital channels in every way possible. In fact, beyond leveraging the smaller-scale applications we’re all familiar with, all organisations should be undertaking digital transformation at a fundamental level. At its heart, this means optimally connecting people, processes and content to achieve a competitive advantage.
By John Newton, CTO and Founder of Alfresco.
The digital enterprise works differently from the pre-digital enterprise. It’s globally connected. It’s immediately responsive. It’s massively collaborative. It’s mobile, data-driven, and always on. It never stops changing. Yet many are not taking this shift in business thinking on board and are lagging behind. There are five ways to tell if your company will flourish or vanish in the digital economy:
Addressing and focusing business activities on new and adjacent markets sets digital-first leaders apart. Avoiding disruption from new entrants and start-ups is key, and their priorities lie in optimising the customer experience and constantly introducing new products, whereas those that lag behind often concentrate on cost-cutting, efficiency and increasing revenue from existing products rather than on how to evolve.
Successful digital transformation must start at the top. It is a board-level agenda item, and the CEO must drive and take responsibility for the programme, which should be seen as the foundation for the business of the future. If digital transformation duties are delegated down the command tree, as though they sit under the umbrella of technology rather than being prioritised in their own right, companies fall further behind.
Lagging companies' objectives are all over the place, and if they are motivated, they normally prioritise refreshing technology and cutting costs. In companies that are digital leaders, there is a focused agenda for digital transformation. This is often centred on engaging customers more effectively and building out their ecosystem using new and digital technology.
Looking forward, we seem to be on the cusp of major changes in digital transformation, with disruptors revamping ecosystems to connect internal systems to their customers, partners and suppliers more efficiently. This will have the effect of transforming how they develop their value chains to be more digital and integrated – giving them a real competitive advantage. And if digital leaders seem ahead today, their intentions over the next three years are even more aggressive, to ensure they stay ahead.
The majority of digital leaders have embraced open standards, open APIs and open source. Openness and open thinking are a core part of digital transformation. This is critical to making the digital connections to customers and partners, enabling them to transform how they do their business in a more agile and flexible way.
In fact, recent research found that roughly half of the CEOs at fast-growing companies are taking the bull by the horns. What are they doing right?
There are three levers that organisations can adopt to catalyse a new approach to their business, deploy a successful programme of digital transformation and gain a competitive edge:
Design thinking -- The optimal flow between the user, what they need and their experience should drive business technology decisions.
Open thinking -- Collaboration is a powerful business accelerator. Innovation from both inside and outside the organisation is encouraged to drive new initiatives.
Platform thinking -- A single, central solution through which you can route information, automate processes, and integrate third-party innovation. This results in the exchange of capabilities and data in a manner that creates added value, repeatable experiences and connects users with information and/or services quickly and meaningfully.
Today's corporate leaders must realise that they need to disrupt or risk being disrupted. Those that are not yet thinking about how they will innovate with new approaches and leveraging technology to its utmost are at risk. Digital transformation is not just a critical stepping stone to success, but key to an organisation's very survival.
The ever-evolving digital landscape has led to the emergence of a multitude of digital technologies. New technology, however, inevitably brings with it both exciting new opportunities and challenges, which need to be approached with a note of caution.
By Manoj Karanth, GM and Head – Big Data Analytics and Cloud, Digital Business, Mindtree.
Cloud technology has been at the heart of this. Companies across sectors have taken swift action to adopt and incorporate some form of cloud computing offering into their digital approach. These new technologies have the capacity to reshape the competitive landscape, but they must be managed in the right way to be effective.
The latest global survey from HBR Analytic Services, released in April of last year, found that 85 per cent of organisations surveyed, across a host of sectors, territories and industries, planned to use cloud tools moderately to extensively over the course of the next three years. This figure, if nothing else, is symbolic of the importance that businesses attach to the cloud’s ability to reshape their approach to digital.
The cloud has the capacity to increase business speed and agility, shape new business models, foster innovation and ideas, and expand collaboration across the workforce. It is no surprise, therefore, that the survey results revealed a high adoption rate.
Aligning business and IT objectives is crucial to widespread successful implementation of cloud technology across organisations. IT teams cannot act alone, no matter what capacity they have. Business leaders need to take action too.
Together, they need to play a mutually strategic role in constructing new business models that both leverage and incorporate cloud technologies in agile and innovative ways. If not, they risk losing considerable market share, or being left in their competitors’ wake altogether. This, however, is the transition that the vast majority of companies struggle with the most.
A recent Cloud User survey from the market research firm Frost & Sullivan highlights the crux of this problem: according to the results, 57 per cent of IT decision-makers consider migration a major obstacle to moving their workloads to the cloud.
Sadly, unlike the 1989 hit film Field of Dreams, which depicts a novice farmer with a vision to transform his cornfield into a baseball field and, by doing so, save his farm in the process, cloud migration does not work in the same vein. Cloud migration is not simply a case of “if you build it, he will come”.
Leveraging cloud technologies to grow a business takes more than vision. It requires having the right people on your team to execute on it, and is simply too complex and risky to plan and implement alone.
Mapping the right route to a successful cloud journey
Strategy needs to be front and centre. Without a comprehensive road map and measurable business goals, migrating successfully will always be too big a mountain to climb.
The right anchor partner can help you customise and develop the right business case, using metrics and incremental ROI objectives as markers along that journey. An anchor partner can analyse and evaluate the best cloud approach (whether public, private or hybrid) and cloud deployment type (IaaS, PaaS or SaaS) to achieve strategic business goals. Under these banners fall readiness assessments, change management, and audits to assess your security and system architecture requirements.
Once these initial strategic foundations have been laid, an anchor partner is then responsible for determining the proper sequence of application and data migration. This deep portfolio analysis includes assessing each application’s suitability for the cloud, reviewing migration intentions and developing a suitable timeline.
Doing this work from the ground up builds a unified view of the company’s application infrastructure, and then identifies and prioritises the enterprise systems, applications and data that most critically need to be moved to the cloud.
Keep your eyes on the ball
Whether it’s moving data sets or entire workloads, transitioning to the cloud ties up IT resources that are needed elsewhere. An anchor partner is the glue that holds this all together. They are with you every step of the way, allowing the IT team to stay focused on what it does best – building and supporting great products and services.
Businesses may begin in the cloud, but every journey is different for each organisation. No digital transformation is the same. Choosing the right anchor partner provides businesses with faster application implementation and deployment, reduced infrastructure overheads and greater flexibility to scale resources on demand.
Whether businesses do this by freeing up budgets for digital transformation or by tapping into cutting-edge ways to store, analyse and segment big data, improving market efficacy through personalisation will always be priority number one.
Ultimately, businesses want to maximise the opportunities that the cloud has to offer, regardless of their current presence in it.
Not that long ago, the Internet of Things (IoT) was a somewhat vague concept that seemed to belong to a distant future or sci-fi films. However, as technology continues to advance at a rapid pace and transform business models, IoT is quickly becoming a reality, revolutionising the way we live and work.
By Craig Smith, EMEA Director for IoT & Analytics Solutions and Services, Tech Data.
Businesses across the world are adapting to the new technologies associated with this and seeing the benefits that it can bring. From increased business efficiency to enhanced customer engagement, the potential rewards mean that now is the time to get involved, if you’re not already.
IoT is transforming the way businesses operate across many different industries. In retail, for example, digital signage, beacons and “smart shelves” are revolutionising the way that businesses communicate not only with their customers, but with different players in their supply chains, from shipping, to warehousing and distributing. Local governments are also seeing the benefit of IoT with applications like smart lighting and traffic systems allowing for more efficient allocation of budgets and resources. Sensors in our roads, for instance, can “talk” to vehicles and traffic light systems to optimise the flow of traffic in cities. The drive for city planners to adopt this type of technology is part of the move towards making smart cities a reality. Sadiq Khan, the Mayor of London, has recently shared his vision for London to become the world’s leading smart city. This will require the adoption of a range of smart technologies, moving towards more ambitious projects that make full use of the mass of data that city planners have at their disposal through smart technology.
To unlock the full potential of IoT and all it offers, a broad range of technical skills is required. Businesses need to have a full understanding of technology like sensors, gateways, networking, security, analytics, cloud, digital applications, software integration and API management, workflow and storage. They need to know how to apply these in order to create an integrated IoT “ecosystem” where data is automatically communicated across one seamless, manageable framework. The sheer breadth and depth of knowledge needed to drive success means that IoT transformation plans can be incredibly complex.
Adapting to this new technology and planning the transition is a major undertaking and presents a significant challenge to businesses. Experts in this new technology are needed to guide the business through the transformation process and ensure that it happens as smoothly as possible. People at board level must not only understand the need for change, but also be able to justify why it is needed for the business and how it benefits customers.
One of the main challenges in leading IoT transformation will be gathering these skills and expertise together. Attempting to do this in-house is likely to require a significant investment of money, resources and time, and most IT providers will find themselves at financial risk if they attempt to take on this new technology without any outside help. The solution lies in finding a partner to collaborate with, who will make use of your existing skills whilst providing expert guidance in the areas where you lack knowledge.
The right partner will provide invaluable customer support to navigate the transformation process. They could, for example, help customers to create bespoke sensors for specific applications adapted to the business. They would not only create the sensors, but provide the support necessary throughout the entire process, from design and testing, to creation of prototypes and CE certification, all the way to mass production. They could also provide valuable support at the other end of this type of project, for instance by working closely with software developers and data analytics experts, to augment data created by the sensors with data held in applications in different areas of the organisation. This means that a digital dashboard of insights could be created, providing a holistic overview of data from a range of applications, facilitating analysis and interpretation.
To get the best possible outcomes from an IoT solution, it is essential for teams to have access to the most up-to-date knowledge. This is crucial when we consider how rapidly technology is advancing and the huge number of new technologies being piloted. For a partner to fully understand IoT and its many business implications, good quality training is essential – change leaders may not already possess this technical knowledge. Partner education helps IT providers learn how best to deploy software and hardware solutions in complex real-world environments for their customers.
IT professionals benefit from learning new skills and gaining up-to-date certifications in two key ways. Firstly, they can distinguish their company from rivals through their ability to ensure better IoT experiences quickly. Secondly, first-class training helps IT professionals to build more fulfilling careers in their industry and enhances their personal development.
Instead of attempting to “go it alone”, IT providers have a lot to gain from partnering with a solution provider whose global scale and scope mean they can justify having the skills, expertise and training capabilities under one roof. Partners can often feel concerned before taking on their first Internet of Things project: the jargon, the possible technical difficulties with integration, or even the hardware itself can all appear too much to get their heads around. IoT projects require a broad range of different IT skills that don’t usually exist under one roof. However, the channel is in a unique position because it has access to a technology landscape and partner ecosystem that is constantly evolving. To deliver a successful IoT project, those working in IT must be up to speed with the skills required to understand the ecosystem. Education in IoT solutions will equip channel partners with the skills they need to deliver transformational and innovative IoT projects.
IT providers and resellers need to look for partners who are true technology experts, with an unparalleled knowledge of the many different elements of IoT. Crucially, they must have a deep understanding of the intricacies behind each piece of technology being deployed. They must be able to explain in detail its business applications. It is also important that they are able to talk to the board of an organisation about the technology in-depth, sharing their vision of how applying it to the customer’s business will be transformational. In a way, they have to be visionaries – able to guide an organisation through its IoT journey from start to finish, communicating their vision in a way that is as compelling as it is credible.
As IoT continues to advance, revolutionising the way we work and live, it is becoming increasingly commonplace in both the private and public sectors. Products like hub-based smart home systems are also increasing in popularity – this shows that consumers are accustomed to the technology and therefore expect to see it in use, so adopting it now is key for companies hoping to remain competitive and keep ahead of the curve. Failing to adopt it means they risk falling behind the times and losing their customers to rivals who are actively incorporating the latest technologies into their businesses.
DW talks to Jed Ayres, IGEL’s Chief Marketing Officer, about the company’s mission to help companies optimise their end user computing environment, based on the mantra ‘Simple, Smart and Secure’.
1. Please can you provide some background on IGEL as a company?
IGEL helps companies optimise their end user computing environment. Developing and distributing endpoint technology since 1989, early on IGEL figured out that the core of the user experience was in the software that enables the hardware to “sing”; hence our passion to build both exemplary hardware and software.
In our R&D office in Augsburg, Germany, we have 75 engineers dedicated to developing new thin and zero client solutions. We incorporate and anticipate user needs so that IGEL’s thin client management system is radically more intuitive, secure and scalable than any other solution on the market.
Today, IGEL has a global customer base with:
· 17,000 customers worldwide
· Over 2 million hardware clients
· More than 300,000 software clients
2. Including the major milestones to date?
1989 Introduction of green screen terminals leading onto the development of IGEL thin clients
2003 Launched IGEL’s remote management software to help IT managers centralise control of desktops
2006 The introduction of the first Universal Desktop – an evolution of the thin client: a modular system with a single, standardised operating system that can be individually customised via a modular partition.
2010 The introduction of the Universal Desktop Converter software, allowing businesses to convert ageing PCs and other thin clients into IGEL OS devices, so that IT managers can bring more of their desktops under one system.
2011-2015 Continued development of additional software solutions designed to improve the scalability, manageability and personalisation of endpoints, including the capability to manage Windows PCs in a virtual environment.
2016 The introduction of IGEL OS 10, our first 64-bit operating system. Closely followed by the launch of UD Pocket, the world’s first micro thin client designed to allow users access to their desktop while on the move, from any device; alongside IGEL Cloud Gateway, which enables secure access for any IGEL device that has access to the internet.
3. Please can you give us a brief overview of the IGEL product/technology portfolio? And how does this compare to other offerings in the market?
Unlike our competitors, IGEL is maniacally focused on the end-user experience and on putting the tools in the hands of enterprise IT to effectively enable the end-user, without hamstringing them. Endpoints, IGEL OS and endpoint management are ALL we do! Our mantra is simple, smart and secure:
- Simple to configure and use with granular, context-aware settings
- Smart to manage and optimise, with real-time configurations
- Secure in terms of access and communications
We believe Windows is not an appropriate operating system to have on an endpoint: it is optimised for running applications, not for user performance. Applications and data belong in the Data Centre, where they can be managed effectively and, most importantly, securely. Windows Server, for example, works well when it is patched and managed by IT. The endpoint device is much harder to manage, which is why IGEL OS, a read-only, Linux-based OS configured in layers, is much safer. There are no WannaCry or Petya security issues with IGEL.
IGEL Product Portfolio comprises:
- Endpoints: thin, zero and all-in-one clients
- Conversion software: Universal Desktop Converter (UDC)
- Conversion hardware: Universal Desktop Pocket (a USB drive with the full IGEL OS capable of converting any 64-bit x86 device to a thin client).
- Universal Management Suite (UMS): IGEL’s crown jewels for managing any IGEL OS-powered or agent-enabled device.
- Universal Management Agent: an agent for any Windows powered device to enable management by the IGEL UMS
- IGEL Cloud Gateway (ICG): seamless access to any device that can “see” the internet
- IGEL Management Interface: APIs to allow integration with other software tools
4. For example, IGEL claims to have ‘revolutionised the endpoint’?
Absolutely, businesses deserve an extremely configurable and secure device that enables users to connect to whatever virtual environment they need. Whether connecting to Citrix, VMware, or Microsoft; whether connecting from an internal network or from the public Internet; whether using almost any 64-bit x86 desktop, laptop, thin client or pocket device, our endpoint operating system, IGEL OS, provides a common experience for the users and ensures that they can maximise their productivity. We believe that’s a revolutionary approach and we are delivering it now.
Then our endpoint management software exists to make the life of the IT department easier.
We believe it should be as easy to remotely manage 10,000 devices as 10. That’s why we continually optimise our Universal Management Suite (UMS) to add the functionality that’s most important to enterprise IT and end-user productivity.
5. Please can you talk us through the IGEL software products, starting with the IGEL OS?
IGEL OS revolutionises access to virtualised desktops and applications. Currently in its sixth generation, this time-tested operating system standardises your endpoints and provides adaptive configuration and granular control, while giving users a familiar, trouble-free workspace. Supporting more remote display protocols than any other solution on the market, IGEL Linux 10 is purpose-built for enterprise access to virtual environments of all types.
Benefits of the IGEL OS:
BUILT-IN ENTERPRISE-LEVEL SECURITY
Security-conscious organisations can finally be confident that the core operating system on endpoint devices has not been compromised. Two-factor authentication, smart card readers and trusted execution are already included. Additionally, a Linux-based operating system is virtually impossible to manipulate and extremely resistant to viruses and other malware.
ENHANCED USER EXPERIENCE
Familiar, trouble-free desktop and application environments boost productivity. By moving the desktop PC workload from the endpoint to the data center, you experience true efficiency: significantly faster logins and application loading, more consistent operation and significantly better performance for database lookups and queries.
HARDWARE-AGNOSTIC
A secure operating system for x86 machines built with industry-standard components, regardless of manufacturer. IGEL OS is designed to become the managed operating system for PCs, laptops, tablets and most every other 64-bit x86 device, including thin clients, all-in-one clients and devices hardened for industrial use cases. With the UD Pocket you can access your corporate desktop on pretty much any PC or laptop, anywhere! Just imagine being stranded somewhere: all you need is Internet access and to borrow someone’s laptop, and you are fully operational in seconds. And when you unplug the thumbnail-size drive from the laptop there is no residual impact and no security concerns – just pure, instant productivity.
EASY CUSTOMISATION OF FIRMWARE
From added functionality to corporate branding to screensavers that display corporate messaging, IGEL OS is designed for managed customisation. It’s easy to make the device look and perform exactly the way you want it to, without having to overhaul your backend infrastructure.
COMMITMENT TO FIRMWARE SUPPORT
IGEL OS is updated at least four times a year, assuring access to current client software from Citrix, VMware, Microsoft and others. Furthermore, IGEL guarantees firmware updates for three years after the device is marked end-of-life.
MODULAR CONFIGURATION
Pick and choose functionality. IGEL OS is designed to let an organisation “turn off” unused features. If using Citrix and not VMware for end-user computing, just disable VMware. If using Microsoft RDSH with RemoteFX, and not Citrix, simply disable Citrix. Turning off unused features lets you give back resources to the system and reduce the attack surface of the device.
6. Moving on to endpoint management (UMS)?
Universal Management Suite (UMS) is IGEL’s unique endpoint management software.
Unlike Dell and HP, whose confusing array of management tools only works on their own devices, IGEL offers a single endpoint management solution that gives IT automated backend control while delivering a familiar, trouble-free environment for users.
Purpose-built to simplify complex enterprise environments, UMS supports diverse operating systems, databases and directories. This smart, simple and secure management software lets IT easily manage any remote endpoint.
The IGEL UMS makes it possible to have one integrated, low touch system for universal endpoint management because it is:
· Hardware-agnostic. It manages any converted x86 device, regardless of manufacturer.
· Operating system-agnostic. Whether it’s IGEL Linux OS, or a WES7, Windows 7 or Windows 10 desktop OS with the IGEL UMA extension software installed, UMS manages your existing environment within a single framework.
· Automated. You can instantly enroll, index and manage all endpoints from one intuitive backend system. No scripting is necessary (though it is supported via the IGEL IMI software).
7. And the UMS add-ons?
THE IGEL UMS has a number of powerful add-ons to help meet the variety of specialist endpoint management needs of businesses. These include:
IGEL Cloud Gateway which extends the Universal Management Suite to endpoints running outside the company network, whether that’s in remote branch offices, at home offices or by roaming road warriors. Using only a standard internet connection, IGEL Cloud Gateway enables transparent and secure endpoint management anytime, anywhere.
High Availability, HA, is an optional extension for the IGEL Universal Management Suite (UMS) and enables the UMS to offer any degree of scalability, availability and redundancy. With HA, even large-scale thin client environments (500 or more end devices) can be simultaneously reconfigured.
With the Universal Customization Builder, UCB, the firmware for IGEL Universal Desktop thin clients can easily and reliably be expanded and adapted to meet your needs. For example, you may choose to install local device drivers or special applications. You can even set important Windows registry keys – without detailed knowledge of Shell or Windows scripting.
The Shared WorkPlace, SWP, allows users to hot desk seamlessly, with user-dependent configuration based on setting profiles created in the IGEL Universal Management Suite and linked to user accounts in Active Directory.
With the IGEL Management Interface, IMI, the IGEL Universal Management Suite (UMS) can easily connect via a standard REST API to existing enterprise management systems such as Microsoft System Center or IBM Tivoli. In addition, IMI provides the interface for REST-compatible programming languages to connect autonomous systems together.
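To give a flavour of what such a REST integration involves – with the caveat that the host, routes, token and response fields below are hypothetical placeholders rather than IGEL’s documented IMI interface – a script driving an endpoint-management API of this kind might look like this:

```python
# Minimal sketch of driving an endpoint-management REST API from Python.
# The base URL, routes, token and response fields are hypothetical
# placeholders; consult the IMI API documentation for the real interface.
import requests

BASE = "https://ums.example.com:8443/api"      # hypothetical UMS host
HEADERS = {"Authorization": "Bearer <token>"}  # hypothetical auth scheme

# List managed devices, then trigger a reboot of the first one.
devices = requests.get(f"{BASE}/devices", headers=HEADERS, timeout=10).json()
device_id = devices[0]["id"]                   # hypothetical response field
requests.post(f"{BASE}/devices/{device_id}/reboot",
              headers=HEADERS, timeout=10).raise_for_status()
```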
The IGEL Unified Management Agent, UMA, enables devices running a Windows 7 or 10 operating system to be easily and securely managed by our Universal Management Suite (UMS). In this way, UMA-managed devices fit seamlessly into the IGEL solution portfolio – whether they are thin clients, notebooks or workstations.
8. And the desktop converter (UDC)?
With IGEL’s Universal Desktop Converter it’s possible to repurpose your existing hardware into an IGEL-like device – turning your old hardware investments into manageable, useful devices.
The UDC converts almost any x86 device, regardless of manufacturer or form factor. This thin client conversion software takes only minutes to turn an old device into a universally deployable IGEL Linux-based device that can be easily managed by the IGEL UMS for remote support, zero-touch deployment and easy central management.
9. And the UD Pocket?
The IGEL UD Pocket is the world’s first micro thin client designed to allow users access to their desktop while on the move, from any device. No larger than a paper clip, the UD Pocket is plugged into the USB port of any PC, laptop or thin client compatible device and provides the mobile worker with access to their cloud services, server-based computing applications or virtual desktop. UD Pocket is automatically integrated into the IGEL Universal Management Suite (UMS) for remote support, deployment and management.
UD Pocket also features a perpetual license assigned to each device with regular software updates, making it a cost-effective solution for organisations of all sizes including those with:
· Bring your own device (BYOD)
· A remote workforce operating its own endpoints
· Suppliers, contractors and freelance workers requiring controlled access to the IT networks
10. In terms of the solutions that IGEL offers, what are the main ones relating to applications?
The IGEL OS offers a comprehensive choice of third-party solutions and peripheral devices:
End User Computing - IGEL Thin and Zero Client software and hardware can easily integrate into any IT environment, regardless of manufacturer or operating system. Universal access to cloud-hosted and server-based infrastructures makes it easy to future-proof your investment. IT environments supported include: Citrix, Microsoft, Systancia, Ericom, RedHat, Cendio, VMware, Parallels, Flynet, Leostream and NoMachine.
Unified Communication - From telephone and e-mail to instant messaging and video conferencing, unified communications (UC) unites all your communications technologies in one platform. IGEL Universal Desktop thin clients let you get the most out of modern teamwork, whatever its format. Support included for: Microsoft, Cisco, Ekiga and Sennheiser.
Authentication - Critical to any security plan, authentication confirms a user’s identity and determines whether he or she should access the network or the data before allowing them to proceed. Support included for: Imprivata, Caradigm, Evidian, Gemalto, ACS, Cherry, AthenaSecmaker, HID Global, Reiner SCT, SCM Microsystems, AET, Cryptovision.
USB Management - USB ports allow users to transfer external data or connect peripherals. It’s essential to manage these ports to ensure continued protection from security risks. Support included for: FabulaTech and CenterTools.
Dictation - Digital dictation offers greater flexibility and reduced costs, and the centralized storage of dictation also increases data security. Support for: Grundig, Olympus, Sennheiser, Nuance, Philips and Voicepoint.
Signature - From the signing of a simple document to complex contracts, e-signatures are recognized as a legal declaration of intent. Signature pads enable biometric handwritten signatures to be electronically captured with no media discontinuity. Support includes: Stepover, signotec and Wacom.
Printing and Scanning - IGEL Universal Desktops lay the foundation for bandwidth-optimized printing, efficient print management and scanning. Support for: ThinPrint and Crealogix.
Secure Tunneling - Thin clients do not exchange application data with the server. Still, keyboard input, such as passwords, must be protected from being intercepted. IGEL Universal Desktops support all popular security standards and offer a broad range of additional software clients for enhanced data security. Support for: Genua and NCP.
Terminal Emulation - IGEL Universal Desktop solutions offer efficient access to server-based applications, desktops and even legacy host systems. Due to its efficiency and security, server-based computing (SBC) is still widely used in many business sectors such as commerce, production, transport and logistics. Support for: Ericom, IBM, Flynet.
Multimedia - Support for Fluendo multimedia codecs.
11. IGEL works with a variety of strategic partners. Can you tell us a little bit about some of the work you do with these?
IGEL collaborates with leaders in software technology. We integrate the world’s best technology into our products so our customers will always get the most innovative and effective endpoint management solution.
Citrix is IGEL’s most important strategic partner. We work very closely with Citrix to build our enterprise business with key accounts. Our other strategic relationships include Microsoft, AMD and Intel to ensure our technology remains current, and that we are delivering the best performance possible to users of our solutions.
12. And you also work with OEM partners?
Yes, we have an OEM relationship with Samsung Electronics, one of the world’s most powerful technology brands and an innovator in Thin Client cloud display technology. Samsung uses IGEL Linux OS and IGEL Universal Management Suite 5 (UMS 5) software to power its new TC-L Series all-in-one Thin Client cloud displays.
Samsung’s TC222L and TC242L monitors bring the power, protection and capability of IGEL’s remote automation tools to Samsung’s pioneering display technologies.
This all-in-one Thin Client display combination simplifies the physical workspace, while providing the benefits of Citrix secure app and data delivery technology.
13. Cloud remains an industry buzz topic. How can IGEL help end users who are still working out their approach to Cloud?
IGEL believes in Cloud “Infrastructure as a Service” and “Desktop as a Service” platforms as the future. It just makes sense to have someone else manage all the hassles of the Data Centre for you, particularly when there is a need to expand and contract resources, and add new users at short notice. As a result, we are working to deliver a superior end-user experience for the cloud vendors and for smaller Managed Service Providers (MSPs) offering similar services.
Moving forward, IGEL can radically simplify the provision of this type of service and remove security and management concerns for businesses adopting this model. For example, getting started is simple with IGEL’s UD Pocket – plug in the USB key, boot to USB, connect to the internet, and IGEL UMS will automatically configure your endpoint device via the IGEL Cloud Gateway. Anything that distracts a user can be “turned off” in the configuration, so users have a much more direct way to get connected and be productive.
14. And there seems to be real momentum behind the Open Systems movement (finally!). How does IGEL play in this space?
Our approach is very pragmatic. We choose Linux because we think it offers the most simple, secure and flexible way to operate and manage endpoints. It allows IGEL to most effectively develop and cater for the needs of business and it allows our customers to deploy the most secure, manageable and easy to use endpoints.
If you examine the IDC market share data over the past two years, you will see that the only OS on thin clients showing growth is Linux. All others are in steep decline, including Dell’s proprietary ThinOS, which lost nearly 20% market share in 2016. We believe we have the best Linux OS on the market, especially when fused to our UMS, which can control over 7,000 settings on a device.
15. And are there any other current or anticipated IT developments where you see IGEL providing value-add to end users?
With increasing security threats such as WannaCry, we see that traditional endpoint security solutions like virus scanners and VPNs are no longer sufficient. IGEL’s endpoints are inherently more secure from these threats than traditional desktops because of our small-footprint, read-only OS with security built in. However, we will continue to work with innovative and agile technology partners to provide our customers with a toolkit that enables them to respond to threats and centrally enforce security policies on their endpoints.
16. What is IGEL’s presence in the market in terms of channel/geographical and industry sector presence?
IGEL has offices worldwide and is represented by partners in over 50 countries with over 1,000 resellers and 60 plus technology alliances.
With more than 17,000 customers worldwide, IGEL has specialist expertise in industries such as Education, Financial, Insurance and Legal services, Healthcare, Government, Retail, Manufacturing and Logistics and Utilities.
17. And what plans do you have to grow?
As more and more businesses turn to server-based, cloud and virtual computing, they are rethinking the way they serve and manage their endpoint devices. IGEL’s solutions have been built from the ground up to solve these challenges and we see our future firmly in this market.
We also see a huge opportunity in the Cloud DaaS marketplace with our Endpoint management approach. Using IGEL solutions, Desktop As A Service providers can deliver secure, easily managed endpoints to their customers, using the customer’s existing estate of mobile devices without the need for expensive Virtual Private Networks. This approach will deliver optimum performance with virtually no management overhead giving these providers the opportunity to deliver their services at low cost.
Finally, we see the world of IoT as a growth opportunity, where there is a need to securely control and manage millions of new types of devices (it is estimated there will be 50 billion connected devices by 2020). IGEL is working on a strategy to leverage our software IP to offer a platform in this space, which would include agents on nodes, gateways and edge servers.
18. Looking ahead, what can we expect technology-wise from IGEL over the next year or so?
As we continue to develop our software-driven proposition, you will see increased asset inventory tracking capabilities to provide administrators with a complete view of their endpoint infrastructure, including all USB and Bluetooth connected peripherals.
We plan to further enhance our management software capabilities for any device – whether mobile or on the desk – and we expect to continue developing ever more robust security features for our customers, such as contextual awareness of devices and secure boot support.
19. And what one piece of advice would you give to businesses trying to put together an overall endpoint management strategy?
Don’t fall short at the edge. To realise the gains achieved through virtualisation, cloud or server-based computing, businesses need to choose a smart endpoint strategy.
Choosing the wrong endpoint OS can create chaos resulting in:
· Wasted time – finding out how difficult it is to manage those simple devices.
· Extra licensing fees – for features you thought were included with the device, but aren’t.
· Lost productivity – for your IT staff.
· Poor user experience – the final metric in the equation to determine the true success of your VDI deployment.
IGEL, of course, offers the solution. But then I would say that! I would urge people to take a look for themselves with our free 90-day trial.
NAKIVO, Inc. is a US corporation founded in 2012. Its co-founder, Bruce Talley, had a long track record in general management and market development, and his vision was to provide businesses worldwide with a solution that would help them protect their virtualized environments and secure themselves against the loss of valuable data.
Since then, the company has evolved and is now a fast-growing business with nearly 100% YoY revenue and customer growth. NAKIVO is highly ranked by the global virtualization community: SpiceWorks rated it 4.9/5, Software Informer named NAKIVO an Editor’s Pick, and TrustRadius scored the company 9.1/10.
NAKIVO’s product – NAKIVO Backup & Replication – is a fast, reliable and affordable data protection solution for VMware, Hyper-V and AWS environments. Its development started in Q4 2012, and the first release included an essential set of data protection features. Since then, new releases have been launched each quarter, and the product has gradually acquired more and more useful capabilities. Currently, NAKIVO Backup & Replication is a mature product offering a flexibility, speed and ease of integration that no other product in the industry can provide.
The product is praised by customers for its simplicity and usability, and its easy-to-use, responsive Web interface does not require reading manuals.
NAKIVO Backup & Replication offers quite a variety of deployment options. Customers can install it on both Windows and Linux, create a reliable and high-performance VM backup appliance by installing NAKIVO Backup & Replication on a NAS (now it supports QNAP, Synology, ASUSTOR, and Western Digital NAS), deploy the product as a pre-configured VMware Virtual Appliance, or launch it as a pre-configured Amazon Machine Image. The product installation is incredibly easy and quick, as it requires only one click and takes less than a minute to deploy NAKIVO Backup & Replication with all of its components, and the virtual infrastructure is discovered and added to the product inventory within seconds.
In terms of functionality, NAKIVO Backup & Replication offers an extensive feature set that enables customers to increase data protection performance, improve reliability, speed up recovery, and save time and money. All features are auto-configured and work out of the box.
When you consider a backup solution, you need it to: be fast to deploy and easy to use and manage; protect your data from loss; provide features that streamline and automate the process of creating backups; allow you to copy your backups offsite and to the cloud; ensure instant, guaranteed and easy recovery of your VMs, files and application objects in case of any failure or disaster; guarantee the shortest possible RPO and RTO; and save your time, resources and money. NAKIVO Backup & Replication meets all of these requirements and can do even more, ensuring unprecedented protection of your data.
As a result, over 10,000 customers worldwide are using NAKIVO Backup & Replication in their VMware, Hyper-V and AWS environments, and the top customers use this software to protect 3,000+ VMs spanning 200+ locations. Over 150 hosting, managed and cloud services providers are using NAKIVO Backup & Replication to deliver VM BaaS and DRaaS to their customers.
You can also download a full-featured Free Trial of NAKIVO Backup & Replication and see its advantages.
NAKIVO aims to be 100% channel-based. As of September 2017, NAKIVO has over 2,000 channel partners and a large number of distributors worldwide, and these numbers are growing rapidly. All partners get large discounts, sales training, deal registration and regular promotions to drive sales.
NAKIVO has wide geographical coverage and is currently protecting businesses in 124 countries worldwide, representing a variety of industries ranging from manufacturing and education to airlines. As its geographical and market presence grows, the company does not intend to settle; it plans to do better and go further.
After all, competing SMB data protection products do not provide sufficient data protection and recovery capabilities, and so either fail to get the job done or waste the customer’s time, while competing enterprise products are overly complex and expensive, wasting the customer’s time and money. NAKIVO Backup & Replication fills the market gap by providing a feature-rich and easy-to-use solution at an affordable price.
NAKIVO is one of the fastest-growing companies in the industry. The company’s plans for the next couple of years are to expand its market presence and focus on large enterprises. To do this, NAKIVO is going to gradually add new highly demanded features to its product and further improve its UX.
Craig Walker – Director, Cloud Services at communications and networking provider ALE, explores the technologies that will take enterprises to a whole new level of connectivity – linking together previously siloed workforces, processes and systems and taking advantage of new technologies to transform the enterprise into an enterprise without borders.
Business leaders have more progressive expectations than ever before. Their forward-looking business goals call for integration with new capabilities and technologies – from artificial intelligence (AI) and the Internet of Things (IoT) to bots and cloud services – none of which can be easily integrated into older platforms. Alongside this, CIOs are seeking closer partnerships with other organisations in their business ecosystem, and need to develop ways to support borderless teams where internal employees, external team members, partners and contractors can work together and collaborate seamlessly – regardless of location, device or domain.
Contextual awareness can also bring subject matter experts into the conversation when a customer and advisor need specific queries addressed. For example, a connected platform can integrate with existing on-site clinic or hospital equipment to help manage critical, real-time communications – providing essential notification services and alarms for doctors and nurses across a range of devices and platforms.
The same platform could offer real-time video conferencing capabilities for doctors to check up on discharged patients recuperating at home. Some patients may need to be reminded to take their medication after discharge. By integrating a communications platform with the hospital’s processes and the patient’s electronic medical record, automated alerts can be sent via text or phone call to remind the patient.
As a communications platform as a service (CPaaS) is cloud-based, developers can add real-time communications features such as voice, video and messaging to their own applications without needing to build backend infrastructure and interfaces. A CPaaS with open APIs will integrate with existing in-house and third-party apps, providing a separate and secure environment and allowing multiple users to access the platform at the same time.
CPaaS APIs also enable developers to extend connections to stand-alone infrastructures, providing a simple and secure way to bring communication and collaboration capabilities to systems and processes both inside and outside the company borders. They open the door to new collaborative working models based on innovations such as the IoT, AI and task-automating bots.
Open APIs behind communications platforms allow teams to benefit from proactive notification services which incorporate building security devices, operational equipment and even fire safety alarms into one connected communications platform – ultimately saving lives, avoiding production downtimes, or securing buildings.
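To make this concrete, here is a minimal Python sketch of what such an API-driven notification might look like from a developer's point of view. The endpoint, payload fields and authentication scheme are illustrative assumptions, not any specific vendor's API.

```python
import requests

# Hypothetical CPaaS endpoint and credentials - the URL, payload fields
# and auth scheme are placeholders, not a real vendor's API.
CPAAS_BASE_URL = "https://api.example-cpaas.com/v1"
API_KEY = "your-api-key"

def send_notification(to_number: str, text: str) -> str:
    """Send an SMS-style alert through the platform's messaging API."""
    response = requests.post(
        f"{CPAAS_BASE_URL}/messages",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"to": to_number, "channel": "sms", "body": text},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["message_id"]  # assumed response shape

if __name__ == "__main__":
    # e.g. the medication reminder described above
    print(send_notification("+441234567890", "Reminder: take your medication at 10:00."))
```

The point is that the application never touches any telephony infrastructure; one authenticated HTTP call hands the whole job to the platform.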
There should also be a ‘second pair of eyes’ to ensure total platform security. Any CPaaS solution should be put under constant scrutiny by third party software such as the nmap network security scanner, Nessus cloud for vulnerability management, Qualys for cloud security and SSL Labs for server testing, and audited by an external independent company.
There are now a number of vendors stepping away from proprietary offerings by providing open APIs to make it easier for IT teams to develop more apps, safe in the knowledge they can be integrated into existing communications infrastructure. As CIOs prepare their enterprises for greater digital interdependence, a CPaaS platform based on a hybrid cloud can serve as the catalyst to drive digital transformation for businesses - by enabling communication and collaboration both inside and outside the borders of the enterprise.
According to Salesforce, 79% of enterprise IT teams are now involved in building apps for customers, partners or employees. Conferencing, instant messaging and video calling can be added to these apps, bringing employee workflows in line with each other and being a powerful driver for digital transformation within the enterprise.
However, enterprise-grade functionality of these apps – be this VoWLAN, call routing, directory services or calendar integration – has to be 'consumer grade' in its elegance and ease of use. The connected platform becomes a 'relationship machine' that helps IT organisations deliver the services and technologies to transform how people work.
With open APIs, we are truly on our way to a connected enterprise, with platforms that connect workforces, processes and systems, enabling real-time collaboration so that informed decisions and actions can be taken across the entire organisation. The vision is becoming reality – the enterprise without borders – but with the necessary border control to secure your business during digital transformation.
As long ago as 2015, the analyst firm Gartner ‘retired’ big data from its Hype Cycle – its method of tracking emerging technologies from innovation through to the ‘peak of inflated expectations’, ‘trough of disillusionment’ and the sunnier ‘plateau of productivity’. It reasoned that big data was no longer an emerging technology. Yet, at the same time, it’s said that technologies often take five to ten years to move from the trough to the plateau.
By Sean Harrison-Smith, Managing Director, Ceterna.
While I don't for one minute think that Gartner has dismissed big data as worthless, I do agree with its decision. The term 'big data' has become so pervasive that, instead of an exciting new concept ready to transform sales and marketing as we know it, it has turned into a cliché.
Yet we need to get one thing straight: just because the term itself is overused, it doesn't make whatever it describes less valuable. Many companies have started capturing large volumes of data but don't know how to realise its value in a practical and affordable way. Data scientists who can help analyse the data are expensive to hire and hard to come by, and much data is unusable as it stands – incomplete and lacking integrity and context. More often than not, it's stored in discrete or siloed sources which are difficult to integrate into a single source of information.
So like householders addicted to clutter, many businesses sit and watch their data gather dust and wonder why it’s meant to be such a good thing.
But, of course, the technology world is moving at such a phenomenal pace that further new concepts are taking centre stage – technologies that can inject data with fresh insight so that, instead of forcing us to make decisions based on past behaviour, it can project the way we will act in the future.
Perhaps from now on, we should stop talking about ‘big data’ and start thinking about ‘living data’ instead?
Living data is fluid and is constantly being revised and updated. It’s the reason that the large IT houses such as Salesforce and IBM have heavily invested in artificial intelligence (AI) and tools such as Einstein and Watson; technology that uses significant quantities of data to constantly learn and drive a business’s predictive capacity. So instead of looking at past patterns, it’s looking at predictive ‘living’ trends.
Of course, the idea of AI – computers thinking like humans – isn't new in itself. But previously it was considered science fiction, something that could work against humanity, rather than a business tool that could be used to move our insight and understanding forward.
Previously, there just weren't the huge volumes of data available for computers to learn from. Now, with the internet, social media and, increasingly, the Internet of Things, we have access to virtually unlimited amounts. This has brought the ideas of Machine Learning and Deep Learning to prominence – enabling computers to learn and refine their output the more data they use. Further methods, such as Natural Language Processing and Predictive Analytics, are also maturing on the back of these AI advances.
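As a rough illustration of the predictive side of this, the Python sketch below trains a simple model on synthetic 'past behaviour' data and then scores a new customer profile. The features and the scikit-learn model are stand-ins for whatever an AI platform would apply at far greater scale.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic "past behaviour": [visits last month, days since last purchase]
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
# Toy rule: frequent, recent visitors are more likely to buy again
y = (X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

print(f"Accuracy on held-out data: {model.score(X_test, y_test):.2f}")
# Probability that a new customer profile leads to a purchase -
# a 'living' prediction that improves as more data arrives and
# the model is retrained.
print(model.predict_proba([[1.2, -0.3]])[0, 1])
```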
Data scientists have been called the 'new rock stars' because they are so elusive yet so badly in demand. Until recently, this made it difficult for businesses to analyse their data. Today, emerging AI tools make it possible to bypass that expertise in all but the most complex of cases.
The other innovation that has made the use of AI for business feasible is cloud computing. Previously, the processing power required to crunch the data would have put it out of reach of all but the largest and wealthiest of businesses. Business data was usually held within a mishmash of internal and external sources – often in systems that didn't talk to one another. Cloud-based CRM solutions connect all this data to create a single view of each individual customer. This central hub of integrated data is essential to an AI or living data approach.
Today, businesses of all sizes can take advantage of living data when AI is baked into an enterprise system they already use. This doesn’t necessarily mean a “one size fits all” approach – these systems are invariably customised for each individual business anyhow. But intelligence will be embedded within the context of the business, automatically discovering relevant insights, predicting future behaviour and proactively recommending next best actions. It will learn, self-tune and become smarter with every interaction and additional piece of data.
This is already the approach Salesforce is taking with its AI technology, Einstein, which is designed to bring the benefits of AI to the sales and CRM worlds – embracing, for example, the huge challenge of social media data analysis: monitoring and assessing it to recognise trends, sentiment and relevant events.
When AI is integrated into the fabric of the business in this way it becomes a powerful and valuable tool, making businesses smarter and more forward-looking. It will be up to a good systems integrator or vendor partner to help businesses explore the possibilities and customise the solution accordingly.
Computers have not always been able to think like humans, but they are good at doing things that we find difficult – for example, remembering every detail. AI brings these details to life and applies them to real-world situations. In other words, it turns big data into living data.
Being an entrepreneur used to depend solely on guts and drive, and perhaps a keen eye for an opportunity, good market awareness, and a significant degree of good luck. Now, a solid grasp of technology has been added to that list, especially if you want to keep ahead of the competition. Consider the world-beating giants such as Facebook and Uber that have risen from entrepreneurial roots in the last ten years, all with technology at their heart.
By Gavin Fell, general manager UK, Exact.
Of course, all business owners and managers strive for success, but there simply aren’t enough hours in a day to achieve perfection. Serving customers, finding new ones, managing staff, establishing processes, admin, expanding your market and world view – all this cuts into your time and potential effectiveness as an entrepreneur.
Then there’s technology. Whether you’re recruiting customers, optimising customer experiences or developing new business models, IT is omnipresent. It’s important to remember that, as an entrepreneur, you don’t have to do all the technology heavy lifting yourself. You can spread the workload, while also benefiting from external expertise, by working with the right suppliers and partners. These are companies that have already invested time and money in trialling and investigating new and emerging technologies to determine their worth and application. It means you can pursue your idea, while your technology partners provide the best technology to make it happen.
Yet technology remains a challenge for many entrepreneurs: how do you know what to focus on and what value it offers – and what not to focus on? After all, there are only a finite number of hours in a day.
As a tool for every budding entrepreneur, we’ve assembled an overview of the most important technologies to watch right now, and assessed their entrepreneurial opportunity.
1/ Robotisation
By this, we mean the increasing number of tasks performed by robots that were originally performed by people. Commonplace in the automotive industry, where pre-programmed robotic arms routinely carry out operations with a degree of speed, precision and consistency that humans cannot match, robots have now made their way into office-based roles, such as financial administration. It’s no longer a question of whether robotisation will affect our work, but how much of an impact it will have on employment.
There is no sector or company that won't be affected by robotisation in the longer term. It's worth moving process-driven tasks and routine admin into automated hands to improve service and efficiency, leaving people free to focus on more creative work.
2/ The Internet of Things (IoT)
Broadly speaking, the IoT connects a huge variety of devices to the Internet; not only smart watches and mobile devices, but also household appliances and production machines that are equipped with sensors. Benefits include more efficient processes, better logistics at lower costs, centralised management and automation through a constant flow of sensor data.
Preventative maintenance is another example, as devices can indicate when they need maintenance before issues become visible to the naked eye. For entrepreneurs, it’s primarily a question of knowing when is the right time to make a product ‘connected’, and then making a valid business case.
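A minimal sketch of the preventative maintenance idea: flag a machine for service when a rolling average of its sensor readings drifts past a threshold. The window size, threshold and simulated readings below are purely illustrative.

```python
from collections import deque

def maintenance_needed(readings, window=10, threshold=7.5):
    """Flag maintenance when the rolling average vibration exceeds a threshold."""
    recent = deque(maxlen=window)
    for value in readings:
        recent.append(value)
        if len(recent) == window and sum(recent) / window > threshold:
            return True
    return False

# Simulated vibration readings drifting upwards as a bearing wears
readings = [5.0 + 0.1 * i for i in range(40)]
print(maintenance_needed(readings))  # True: schedule a service visit early
```

The point is that the device signals trouble before it is visible to the naked eye; the business case then rests on what an avoided breakdown is worth.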
3/ Virtual Reality (VR) and Augmented Reality (AR)
Creating a virtual world viewed through a digital headset can be done in two ways: superimposing a simulated layer over actual reality (augmented reality or AR) or creating a completely new world (virtual reality or VR). Again, many providers are busy developing products and software – perhaps the best-known application is Pokémon GO – but how relevant are these two technologies for companies?
VR and AR will best suit companies that benefit from offering an impressive and innovative customer experience, or those needing a three-dimensional or data-rich view of their production or operating facilities.
4/ Blockchain
The much-touted poster child of new finance, blockchain has gained popularity in recent years. It is best described as a system that automates trust. Its original application was in bitcoin, the digital currency: everyone who owns or trades in bitcoins is connected to a shared, distributed ledger. When ownership of bitcoins is transferred between two people, the transaction is registered not only in the systems of the people involved, but in the records of everyone who is connected.
It does this in a way that is verifiable, almost impossible to manipulate or hack, and safe enough to be distributed rather than centralised. It sounds simple but, according to many experts, blockchain paves the way to an almost infinite range of applications. It's still anyone's guess how blockchain will take off beyond bitcoin, but what's certain is that there are opportunities out there for entrepreneurs willing to take the leap.
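The 'automated trust' idea is easier to see in miniature. The Python sketch below chains blocks together by hashing each one along with its predecessor's hash, so tampering with any earlier entry invalidates everything after it. It is a toy model only – there is no network, consensus mechanism or mining here.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash of a block's contents, including the previous block's hash."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, transaction: str) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"prev_hash": prev, "transaction": transaction}
    block["hash"] = block_hash(block)
    chain.append(block)

chain: list = []
add_block(chain, "Alice pays Bob 1 BTC")
add_block(chain, "Bob pays Carol 0.5 BTC")

# Tampering with an earlier entry breaks its stored hash, so every
# participant holding a copy of the chain can detect the change.
chain[0]["transaction"] = "Alice pays Bob 100 BTC"
recomputed = block_hash({k: v for k, v in chain[0].items() if k != "hash"})
print(recomputed == chain[0]["hash"])  # False: the ledger exposes the edit
```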
What all these technologies have in common is that they depend on scale and standards. That’s not so strange, because these conditions are important for the success of any technological breakthrough. Take the rise of high-definition video, for example, which only gained ground in terms of pre-recorded, high-definition content when mass production was possible and a viable standard (Blu-ray) emerged.
As an entrepreneur, you should avoid betting too early. You should also avoid making your entire play a technology one. Doing so raises the risk considerably that you will back the wrong horse due to a lack of market information. A good entrepreneur knows when and how long to follow a development before jumping in. They also know to make technology a component of the outcome, rather than making it the lynchpin of a project or action.
DW talks to Jean-Michel Franco, Director of Product Marketing for Talend, about the rapidly evolving Big Data landscape.
1. Data migrations are an almost inevitable and crucial part of any Big Data project – what are the main issues that need to be considered?
With the advent of big data and the cloud, data migration is evolving into an ongoing activity: customers want to take advantage of the latest technology faster – for example, migrating from a relatively old Hadoop technology such as MapReduce to a more modern one such as Spark, moving from on-premises to the cloud, or evolving from batch to real time. Cutting the delays and costs of migration has become very important, which means that enterprises need to define a set of best practices and an automated toolset for migration. Once that is done, data migration projects become low-cost initiatives that can be delivered in a matter of days, allowing IT systems to take advantage of the latest technology options.
2. In more detail, presumably it’s a good idea to ensure all affected personnel are fully engaged in the data migration/Big Data project?
Yes. For example, one aspect that is often neglected during migration is data quality. Data quality issues can hurt a migration project when they are not anticipated, and they are a big pain for the people in charge of the migration, because they may not be the ones who know the data best, and are therefore not best positioned to solve the issues themselves.
Treating data quality as a specific sub-project is a best practice. Not only does it allow you to anticipate potential roadblocks in the migration project, it also gives you the opportunity to engage business users in improving data quality during the migration process. For example, think about migrating your on-premises CRM applications to the cloud while taking the opportunity to define new data quality SLAs in the new system. You could use this opportunity to check the accuracy of customer contact data such as email addresses and telephone numbers, and engage the business to curate the invalid data before it goes into the new system. This increases the value of the migration project, since data quality improves along with the new system.
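As a simple illustration of that curation step, the Python sketch below splits customer records into those clean enough to migrate and those routed to business users for fixing. The deliberately simple email check and the record shape are assumptions made for the example.

```python
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # deliberately simple check

def split_for_migration(records):
    """Route records with valid contact data to the new system and
    send the rest to a curation queue for business users to fix."""
    clean, needs_curation = [], []
    for record in records:
        if EMAIL_RE.match(record.get("email", "")):
            clean.append(record)
        else:
            needs_curation.append(record)
    return clean, needs_curation

customers = [
    {"name": "Ann", "email": "ann@example.com"},
    {"name": "Bob", "email": "bob@invalid"},  # fails the check
]
clean, needs_curation = split_for_migration(customers)
print(len(clean), len(needs_curation))  # 1 1
```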
3. Presumably data governance is a major issue – especially bearing in mind that GDPR is on the horizon?
Yes it is. GDPR is the first cross-industry regulation that mandates accountability in data management practices. Before GDPR, some industries were more mature than others on data governance because of heavily regulated processes – examples are banking (with BCBS 239, Know Your Customer, Basel III and so on), insurance (with Solvency II) and healthcare (e.g. with HIPAA). Now GDPR brings that mandate to every industry, whatever a company's size (proportionally, a GDPR fine hits SMBs even harder, as the €20 million minimum exceeds 4% of global revenue for companies with revenue below €500 million). Companies need to establish best practices for data protection and make sure they remain accountable for their data assets, even when data runs in the cloud or is exchanged with business partners.
4. And quality of data, especially when dealing with legacy data, can be an issue?
Yes – poor data quality costs companies more than $9 million on average, according to Gartner. In areas such as sales and marketing, it is estimated that more than 20% of contact data is inaccurate, which means that data quality is killing the benefits of CRM initiatives: inaccurate contact data kills your marketing campaign conversion rate, while inaccurate profile data kills the benefits of personalisation and of applying machine learning based on data inference.
Note that data quality issues are not confined to legacy data. With the advent of big data, IoT and user-generated content, new systems have the potential to bring in new sources of data, but this new content brings new challenges that have to be tackled for the sake of data quality.
5. And then there’s the task of trying to discover all of the data dependencies before the migration process throws up problems?
We provide a metadata management platform to address this topic, called Talend Metadata Manager. The product comes with hundreds of connectors to automatically harvest the data structures from multiple applications, such as modelling tools, databases and file systems, business intelligence tools, and data integration tools (not only Talend). Once the structures are harvested, the tool lets you stitch together the links between the different applications to discover the data dependencies. The metadata can then be reused to automatically populate the data structures within the new system.
6. And testing the data migration process before, during and after is crucial?
Yes. Traditional system integration has learned from the agile development approaches that came with the digital era: think about a website that might have multiple releases in the same day. We need to achieve the same level of agility for IT in the era of cloud and big data. Continuous Integration (CI) emerged as a development practice that gives developers access to a shared repository where they can integrate code, then automatically build, test and deploy their code several times a day. Delivering continuous integration within a unified platform is a cornerstone of our data agility strategy.
7. In other words, what do you consider before staying in-house, going third party/cloud, or choosing some kind of a hybrid approach?
As the pace of change accelerates, we are seeing the value of the cloud expanding. Look at the evolution of big data, for example: within Hadoop, some technologies are losing ground while new components are quickly gaining traction. Businesses are demanding to reap the benefits of innovation faster, and topics like real-time data, or applying machine learning to big data, are rapidly evolving from an option into a must. In this context, think about a company that has not yet started its big data journey. Creating a platform by itself would take months, compared with a cloud option where it can take advantage not only of on-demand infrastructure with unlimited capacity and pay-per-use pricing, but also of ready-to-go, sophisticated services for data management, data processing, machine learning, IoT management and analytics. Then there's the need to innovate on a continuous basis, and to reap the benefits of pay-per-use where it brings significant cost reductions.
We see many of our customers embracing a cloud-first approach while maintaining the option of a hybrid approach whenever needed. A great example is our customer Lenovo, which runs most of its customer analytics and big data applications in the cloud. However, where the data relates to customers' personal information, Lenovo considered that it should have total control, and keeps that data on a private cloud.
8. Choosing the right migration tool(s) is very important?
Gone are the days when you migrated your applications every twenty years. Migration has evolved from a one-off, event-based project into a continuous change exercise that you need to run on a regular basis. This requires discipline, best practices and tools, and companies are realising that they need to set their own standards for well-managed migrations.
9. In summary, what are the pros and cons of keeping Big Data projects in-house versus heading for the Cloud and/or using some kind of software-as-a-service solution?
The value of the cloud for big data projects is growing drastically over time – think about the high-value services for dealing with real-time data, cutting costs through pay-per-use models and more flexible architectures, integrating machine learning, and so on. It will become harder and harder for IT departments to provision the same breadth and depth of services on their own. That is why many companies are opting for a cloud-first strategy, while making sure they do not lock themselves into a specific cloud. Cloud-first doesn't mean that companies unplug their on-premises systems, or never choose that option in some cases; on-premises is still an option, just no longer the default choice.
11. In more detail, can you talk us through the Data Fabric platform – this was recently updated?
Talend created a platform, the Talend Data Fabric to help companies become best-in-class data-driven enterprises.
Talend Data Fabric combines all Talend’s products into a common set of powerful, easy-to-use tools for real-time or batch, data or application integration, big data or master data management, on-premises or in the cloud.
It helps companies increase data agility, speed time to market and meet the latest data challenges with a solution that's five times faster and one-fifth the price of legacy integration solutions. Through its integrated self-service capabilities for data preparation and data stewardship, it allows any user, from data experts to business users, to turn their daily activities into data-driven processes and to collaborate on data curation and protection.
11. And Talend offers Big Data Integration and Real Time Big Data solutions?
Yes, Talend can connect to any big data platform, on-premises or in the cloud. Even more importantly, it runs natively on Hadoop, which means it can take advantage of Hadoop's impressive power, from data processing at extreme scale to machine learning with Spark and managing data in motion with Spark Streaming. What makes Talend unique is that it helps companies reap the full benefits of those platforms while using a no-coding, visual development approach. This allows developers to design their applications five times faster and to re-platform their developments to the newest technology in one click. It can also turn potentially any developer (or even business user – see the self-service section) into a big data expert.
12. And then there are the Data Preparation and Data, Application and Cloud Integration tools?
In addition to Big Data Integration, Talend provides traditional data integration for batch data flows, as well as application integration for real-time integration within the same unified platform. Because Talend is a code generator, data flows can run everywhere without prior setup, deployment or installation. In particular, Talend can run on any cloud and even supports multi-cloud strategies where data needs to be moved and processed across clouds. Lastly, Talend Integration Cloud is a secure cloud integration platform-as-a-service (iPaaS) to integrate all your cloud and on-premises data. Talend Integration Cloud puts powerful graphical tools, pre-built integration templates, and a rich library of components at your fingertips.
13. And Talend offers a Data Quality product?
Rather than delivering data quality as a stand-alone product, Talend chose to embed its data quality capabilities within its unified platform. This means that data quality can run everywhere: at massive scale within a big data cluster, on top of enterprise applications like salesforce.com, or to filter and standardise streaming data from the Internet of Things.
Talend Data Quality profiles, cleanses and masks data, while monitoring data quality over time regardless of format or size. Through de-duplication, validation, standardisation and enrichment, you create clean data for access, reporting and analytics, and you can integrate external reference data sources for postal validation, business identification, credit score information and more.
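For readers who want a feel for what de-duplication and standardisation involve, here is a generic illustration in Python using pandas. It is not Talend's product or API – just the underlying idea of normalising values into match keys and de-duplicating on them.

```python
import pandas as pd

contacts = pd.DataFrame({
    "name":  ["Ann Smith", "ann smith", "Bob Jones"],
    "phone": ["+44 20 7946 0958", "02079460958", "0161 496 0000"],
})

# Standardise: strip everything but digits from phone numbers and
# normalise names, so different spellings of the same contact match
contacts["phone_std"] = contacts["phone"].str.replace(r"\D", "", regex=True).str[-10:]
contacts["name_key"] = contacts["name"].str.lower().str.strip()

# De-duplicate on the normalised keys, keeping the first occurrence
deduped = contacts.drop_duplicates(subset=["name_key", "phone_std"])
print(deduped[["name", "phone"]])  # Ann Smith appears once, Bob Jones kept
```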
14. Finally, what does the Master Data Management solution do?
Talend Master Data Management (MDM) unifies all data—from customers to products to suppliers and beyond—into a single, actionable “version of the truth.” As part of the Talend Data Fabric, Talend MDM combines real-time data, applications, and process integration with embedded data quality, stewardship and self-service apps to share across on-premises, cloud, and mobile applications.
15. With AI and IoT just gaining momentum, how important is it that end users establish some kind of a data integration/Big Data strategy ahead of this data explosion?
On the one hand, there are more and more data points that can be leveraged; on the other, more and more opportunities to turn them into real-time insights or actions. The physical world is now connected in real time to the digital world, able to sense and respond to events. But the prerequisite to reaping the benefits is to draw an end-to-end information supply chain that turns raw data into smart data. This is the ultimate goal of a data integration and big data strategy. It also mandates well-defined data governance strategies to deal with the dark side of data: now that data is everywhere, it is subject to leaks and violations of data privacy, and can lead to the wrong outcomes when it is not accurate.
16. Bearing this in mind, where’s the right place to start such a project – understanding the data you already have, deciding what data you need to acquire…?
There's no one-size-fits-all approach but, with the advent of big data, it has become easier and easier to start with the outcome and capture the data you need for that outcome. See how leading GPS companies such as Waze have transformed the driving experience by turning each of their users into a sensor, giving them a real-time, always accurate view of the state of the traffic.
With this kind of approach, the goal is not so much to understand the data you have, but rather to find ways to capture the data you need but don't have yet.
17. Finally, what can we expect from Talend over the next 12-18 months in terms of the products and services that will be developed to help customers address data migration/Big Data challenges?
Expect our technologies to become smarter with the use of machine learning. We have already started to leverage machine learning so that our products understand the meanings of, and connections between, data by learning from data experts. And now that our products run in the cloud, we have the ability to learn from how data experts use them, and to improve them accordingly.
How businesses are using APIs to create the future of IT infrastructure
By David Grimes, VP of Engineering at Navisite.
For many businesses, improving the quality of products and services while increasing efficiency and reducing costs is a key challenge, and in many of them the IT team is now at the forefront of this challenge. IT departments, along with the cloud computing industry and IT service providers, are increasingly looking to APIs (Application Programming Interfaces – sets of routines, protocols and tools for building software applications[1]) to achieve this by enabling automation throughout the IT stack.
Automation of IT processes is emerging as a key enabler of more reliable processes and greater efficiency – and APIs are what make that automation possible.
APIs can be programmed to perform monitoring functions, such as waking up periodically to check that a URL is available, and a response can be instrumented directly into the platform. For example, if the programme detects that a URL isn't responding, it might do some forensics and find that the web server service is not running; it can then restart the service automatically through a further API call. This degree of automation distinguishes itself from historic usage, where APIs were used only to generate an alert that created a message but still required a human operator to log in and take action.
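A minimal sketch of that monitor-and-remediate loop might look like the following, assuming a hypothetical hosting-platform API for the restart call; the URLs and token are placeholders.

```python
import requests

STATUS_URL = "https://www.example.com/health"
# Hypothetical management-API endpoint exposed by the hosting platform
RESTART_URL = "https://api.example-host.com/v1/servers/web-01/services/nginx/restart"
API_TOKEN = "your-token"

def check_and_remediate() -> None:
    """Poll the site; if it is down, restart the web service via an API
    call instead of just raising an alert for a human operator."""
    try:
        requests.get(STATUS_URL, timeout=5).raise_for_status()
        return  # site is healthy, nothing to do
    except requests.RequestException:
        pass
    # Automated remediation: one API call replaces a manual log-in
    requests.post(RESTART_URL,
                  headers={"Authorization": f"Bearer {API_TOKEN}"},
                  timeout=30)

# In practice this would run on a schedule, e.g. every minute from a cron job.
check_and_remediate()
```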
Importantly, automation through APIs also ensures consistency. Businesses can remove the human error from operational processes through their automation. Even when routine tasks are well-documented with clear processes, when human workers perform the task there will likely be some variation in the outcomes. However, if the same task is automated, it will be performed in the same way every time, improving operational reliability and, in turn, operational efficiency.
Use of cloud APIs to achieve greater efficiency and consistency in workflows also extends into DevOps procedures. Here, APIs allow for more dynamic systems that can scale widely from an application-down perspective. For example, instrumentation in your application that provides visibility to an orchestration layer can detect when more capacity is needed in the web or application tier. The orchestration layer can then come back to the APIs provided by the infrastructure and begin spinning up a new server, adding it to the load balancer pool through the same APIs to increase capacity. Similarly, systems built on APIs can then also have the instrumentation to detect when they are overbuilt, for example at night, and can then automatically wind down unnecessary web servers in order to reduce costs.
The ability to automate the powering-on of development and testing environments at the beginning of the business day and powering-off at the end of the business day, has the potential to help businesses realise huge savings on their hosting costs – in some cases, up to 50-60 per cent. Ultimately, using APIs to automate DevOps and operations can deliver multiple benefits, blending the ability to optimise processes for cost, performance and deep, app-level visibility.
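A toy version of the scaling logic might look like the following. The thresholds, business hours and server counts are illustrative assumptions; the actual add and remove calls would go to whatever infrastructure API the platform exposes.

```python
import datetime

BUSINESS_HOURS = range(8, 18)  # 08:00-17:59

def desired_web_servers(current_load: float, hour: int) -> int:
    """Pick a web-tier size from demand, then wind down out of hours.
    The numbers are illustrative; real values come from your own metrics."""
    if hour not in BUSINESS_HOURS:
        return 1                 # overnight skeleton capacity to cut costs
    if current_load > 0.8:
        return 6                 # scale up under heavy load
    if current_load < 0.3:
        return 2                 # scale down when overbuilt
    return 4

hour = datetime.datetime.now().hour
target = desired_web_servers(current_load=0.85, hour=hour)
# An orchestration layer would now call the infrastructure API to add or
# remove servers and update the load-balancer pool to match `target`.
print(f"Target web servers: {target}")
```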
In an age where we are generating and capturing more data than ever – with some estimates predicting we’ll create 163 zettabytes of data by 2025 – APIs are also proving highly useful in automating reporting procedures, which enables teams to tap into undiscovered data assets. IT teams need to consider how to make those datasets available in order to build a dynamic reporting engine that can potentially be configured by the end user, who will be the person that understands the nature of the information they need to extract from the data.
This is frequently accomplished through APIs. IT teams and application services providers can use APIs to build systems that process the data and make it accessible to end users immediately, so that they do not have to go through a reporting team and do not lose any of the real-time value of their data.
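As an example of the pattern, the small Flask sketch below exposes a dataset through an HTTP endpoint that end users can filter themselves; the dataset and endpoint are invented for illustration.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in for a dataset that would normally live in a warehouse or lake
ORDERS = [
    {"region": "UK", "amount": 120},
    {"region": "US", "amount": 340},
    {"region": "UK", "amount": 75},
]

@app.route("/api/orders")
def orders():
    """Let end users filter the data themselves, in real time,
    instead of waiting on a reporting team."""
    region = request.args.get("region")
    rows = [o for o in ORDERS if region is None or o["region"] == region]
    return jsonify({"count": len(rows), "total": sum(o["amount"] for o in rows)})

if __name__ == "__main__":
    app.run(port=5000)  # e.g. GET /api/orders?region=UK -> {"count": 2, "total": 195}
```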
There are few areas in IT where seamless processes matter as much as they do in disaster recovery (DR), which is why the benefits of automation through APIs make them a crucial part of modern business continuity plans. In the modern world of highly virtualised infrastructure, APIs enable the core building blocks of DR – in particular replication, which can be driven from the APIs exposed by the virtualisation platforms. The final act of orchestrating a DR failover is therefore often highly API-dependent.
Disaster recovery is one specific use case of how APIs enable efficiency and operations automation. Humans make mistakes, and manual processes become very difficult to maintain and update, which is why a DR plan based on humans executing processes is not a foolproof way to ensure the safety of your business in the event of a disaster. Kicking off DR can be likened to 'pressing the big red button' – and if you can make it one button that kick-starts a set of automated processes, that will be a far more manageable and seamless execution than thirteen different buttons, each with a thirty-page policy and procedure document that must be executed during a disaster.
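In code, the 'big red button' amounts to an ordered runbook of API calls, as in this sketch built around a hypothetical DR-orchestration API (all URLs and the token are placeholders):

```python
import requests

API = "https://api.example-dr.com/v1"   # hypothetical DR-orchestration API
AUTH = {"Authorization": "Bearer your-token"}

# The ordered runbook: each step is a single API call rather than a
# thirty-page procedure document executed by hand
FAILOVER_STEPS = [
    ("stop replication",     f"{API}/replication/stop"),
    ("promote DR database",  f"{API}/databases/replica/promote"),
    ("start DR app servers", f"{API}/sites/dr/servers/start"),
    ("repoint DNS",          f"{API}/dns/failover"),
]

def big_red_button() -> None:
    """One call kicks off the whole failover; stop on the first failure."""
    for name, url in FAILOVER_STEPS:
        print(f"Executing: {name}")
        requests.post(url, headers=AUTH, timeout=60).raise_for_status()
    print("Failover complete")

if __name__ == "__main__":
    big_red_button()
```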
Despite the clear benefits of API-enabled automation and technology, the broader IT industry has not yet fully realised its potential. Ironically, this is particularly true for industries that have been leveraging information technology for a long time. In these industries, we are seeing a critical mass of legacy applications, legacy approaches to managing infrastructure, and legacy staff skillsets.
As a younger generation comes into the IT industry, we will likely move towards more comprehensive API use and maximise the value of APIs – this is a generation that has grown up and learned with them. As disruptors displace incumbent packaged-software players and new entrants join the enterprise IT community, we are likely to see more pervasive leveraging of the benefits APIs bring, particularly when it comes to making full use of cloud infrastructures. However, this will take time; we may be one to two full education cycles away from producing and maturing enough IT professionals with the education and training required to make full use of the opportunities APIs offer.
APIs are also reducing the cost of developing new ideas, as part of wider cloud computing solutions. Companies looking to innovate no longer need to make large upfront investments in equipment to get new ideas off the ground. They can quickly start their business on Infrastructure-as-a-Service platforms and use APIs to control and power systems down to reduce costs as needed. As the new product or service grows, organisations can quickly scale on the same cloud infrastructure. However, for the use of APIs to truly cut costs, they must be part of cloud deployments, rather than a pricey addition.
In the future, we're likely to see even more innovative uses of APIs to drive automation, consistency and efficiency, as businesses continue to try new ways of working. In order to remain competitive, it is key that businesses make full use of new API-enabled software and other technologies in order to fully realise the increased efficiencies that they offer.
The digital world is all around us – always on, ready and waiting for engagement. In fact, every consumer interacts with brands through some form of digital channel, every day. Digital is also driving customer retention and, ultimately, sales, so it's of little surprise that CEOs are keeping a very watchful eye on the delivery of excellent digital experiences. And, according to Gartner[1], CEOs are more focused this year than in years past on how technology and product innovation drive company growth.
The pressure is therefore on IT to deliver excellent digital experiences, and this can be a challenge. Below, Paul Higley, Regional Vice President, UK and Ireland, Riverbed Technology, shares five reasons why your digital experience management (DEM) strategy might not be as effective as you'd like it to be.
1. Application complexity is reality
A recent Gartner survey shows that CEOs are relying on technology to drive growth. The very same survey also shows that these CEOs rank technology impediments as the number two internal constraint to achieving growth goals. The question is, ‘how can technology be both a driver of growth and an impediment to it?’ The answer lies in application complexity.
The application landscape is increasingly complex, and so is the delivery of seamless digital experiences. This is because applications must meet so many more demands today than even just five years ago. For example, applications must address users’ constantly evolving needs, scale based on demand, and remain highly responsive 24/7 across geographies. These innovative applications must also co-exist, and even interact with legacy applications. This means that IT must support not just the new, but the full range of enterprise applications — web, mobile, and apps running in the cloud, on virtual infrastructure, and legacy client-server environments.
At the same time, end users and customers no longer interact with static applications at discrete times. They interact continuously with applications whose architectures have evolved to become modular, distributed, and dynamic — all of which adds further to the complexity of the application landscape.
2. Multi-user universe
The digital experience of customers is of the utmost importance. But, so is the digital experience of your employees, suppliers and partners — all of whom can help you grow or become detractors. If ensuring all these audiences’ digital experiences weren’t enough of a challenge, the advent of IoT requires IT to ensure an excellent digital experience for machines as well!
Having so many users to satisfy can seem a daunting task. For most businesses, the easiest way to tackle it is to start by focusing on their most important population of users. For B2C companies, more often than not, that is the customer. B2B companies, on the other hand, often focus on the digital experience of the workforce.
For both types of organisation, there are big benefits to optimising the digital experience of the right stakeholder group, helping to drive, for example, productivity, sales and engagement, and to streamline customer services. The bottom line is that you have to start somewhere, and it should be with your most important audience.
3. Know who you need to work with
According to a recently published report from analyst firm EMA, 59% of enterprise leaders agree that IT and the business share the responsibility for DEM[2]. In addition, the responsibility often lies at senior levels of the organisation. Whilst it’s comforting to know you are not alone when it comes to making decisions about choosing the right DEM strategy — having opposite sides of the business contributing to these decisions is not always easy.
On the technology side, it's the IT executive suite (CIOs, CISOs and so on) and, on the business side, the head of marketing or digital services who drive decisions about DEM. Although they share responsibility for ensuring an excellent digital experience, these two groups within an organisation will have specific needs that vary greatly, depending on their roles and goals.
The key to success is for all sides to meet each other and engage in the process. It is critical for IT to understand the overall business objectives, and then to work with the business to set goals that will help achieve great outcomes. Equally the business needs to understand that technology, whilst a marvellous tool in the pursuit of growth, is not a silver bullet — as it is more complex than just downloading the latest app and making it work.
4. Finding the right way to measure success
Management expert Peter Drucker once said, "you can't manage what you can't measure." This is especially true when it comes to tracking the success of a DEM initiative.
The IT and business groups above must define metrics for success and track them over time. To serve such a broad range of users, DEM tools must supply a broad set of business and technical analytics, such as application performance, network capacity analysis, end user productivity, application transaction volumes, and customer and partner interactions across the extended enterprise. It’s therefore critical to choose the right tools from the outset and ensure that they can deliver reports for each user group in a format that is both accessible and actionable.
5. Visibility is key to risk mitigation
As IT organisations respond to CEO priorities and roll out new services to drive growth, they need a cross-domain understanding of applications, the networks and infrastructure on which they run, and the impact they have on end-user experience. However, despite all the approaches to monitoring digital experience, applications and infrastructure, IT teams still lack this comprehensive visibility. In fact, it is reported that the typical enterprise has between four and fifteen different network monitoring tools, which complicates troubleshooting, change management and other aspects of service level management. Yet, while these tools provide insight into the performance and availability of their particular domain, they lack visibility into the actual digital experience of customers, the workforce, partners and suppliers.
This fragmented approach to DEM visibility only creates more risk. The IT and business decision makers therefore need to work together to ensure specialised monitoring tools provide insights that do not create silos, but rather enable all parties to know what is going on and enable each team to help the other to achieve the overall business objectives.
An effective DEM approach should close the visibility gap between what IT monitoring tools show and what customers and the workforce are actually experiencing. Each group within IT and the business should receive, and understand, the metrics and analytics they need to ensure a successful digital experience outcome.
Ultimately, when it comes to meeting or exceeding the big boss's expectations for driving growth, the key is to ensure you have an effective DEM strategy, with all sides of the business working together. Failing to do so could mean lost revenue, lower productivity and even irreparable damage to a company's brand.
The digital revolution has brought a wealth of performance-improving technologies and platforms to businesses. Specifically, smartphones and tablet technology have introduced a raft of opportunities, in particular when it comes to driving efficiencies and effectiveness.
By Rob Mannion, MD of RNF Digital Innovation.
Back in 2016, Gartner predicted that large enterprises could have up to 100 mobile apps within the business by the end of 2017 but that this demand for apps outstrips available development capacity, making quick creation of apps challenging for businesses.
The Three E’s
The benefits of overcoming this challenge are significant, as highlighted when looking at the Three E’s (Efficiencies, Error-Reduction and Effectiveness) – a concept first introduced by Richard Marshall, a senior Gartner analyst.
A successful mobile app will make a difference to a business in each of these areas – efficiencies, error-reduction and effectiveness – to a greater or lesser extent.
Maximising ROI and integrating systems
While the potential benefits apps offer businesses are clear, only those mobile apps that are properly thought through and implemented across traditional departmental borders maximise ROI.
Rolling out a new digital app to replace, or work alongside, a core process within an enterprise has the potential to cause organisational disruption if the business fails to plan for its deployment accordingly. For example, an app development programme can easily grind to a halt if a business is unable to sort out the appropriate APIs (Application Programming Interfaces) with its existing business software.
The best way to overcome this, and ensure the app achieves the desired ROI, is to consider running two projects concurrently in order to mitigate the risks.
The first is the app development project itself: designing and building the app and integrating it with existing business systems.
The second project identifies how and where processes or procedures need to change as a result of the app, working with departments across the business, such as IT, sales, marketing and HR. The ultimate objective is to outline, document and agree a new set of processes with all parties, without which any app implementation will at best deliver only a fraction of the benefits expected and, at worst, completely fail.
CASE STUDY: Calor Gas Field Pro - creating efficiencies and driving cost savings
Brief
Calor Gas – Britain's leading supplier of LPG (Liquefied Petroleum Gas), with a field sales team of over 100 staff – was one of the first British enterprises to implement an ambitious mobile transformation programme targeting its sales force. The company wanted to develop an app to drive efficiencies and operational improvements, running two projects side by side to ensure the best possible ROI.
Process
Engaging the sales, legal, installation, engineering, marketing, I.T. and finance departments, as well as the CEO and the Board in the planning process, the team was able to identify challenges and areas of mutual benefit that informed the app’s design.
Solution
The solution was a sophisticated iPad app, specifically developed to assist and boost Calor’s commercial and domestic cylinder and bulk business, which made the most of digital technologies to streamline the sales process: from site visits to data entry; contract signatures to delivery details.
For example, the sales force previously had to draw a paper plan of a site (someone's garden for a domestic customer, say, or factory grounds for a commercial one), show the customer where the tank would sit, take the drawing back to the office and hand it over to the customer operations team and then the engineering team, who would take it back on site for installation – creating many opportunities for the drawing to be damaged or lost, and causing delays. With the app, the team could use a digital drawing tool to create the image, which could then be easily shared with the relevant departments.
Results
The app created numerous process efficiencies, generating cost savings that surpassed Calor’s expectations. End-to-end signup to fulfilment time was at least halved for all orders, with some order types becoming over four times faster.
By simplifying the sales process, increasing the accuracy of data capture and eradicating many sources of error, the app helped the team sign measurably more deals than in previous periods, while delivering significant efficiencies and cost savings and fundamentally improving customers' experience of the sales process.
A significant amount of planning was also required to ensure the new system integrated seamlessly with established I.T. systems. By engaging end users in the design from the outset and ensuring all parties received appropriate training, adoption of the app went smoothly and was well received. Rather than being a standalone app for the sales team, it is now viewed as a platform to be built on across the company, facilitating maximum ROI.
This success was only possible because Calor engaged all relevant departments in the process, implementing a business transformation exercise alongside the app development itself. This identified a wide array of improvement opportunities while ensuring company-wide standardised processes were developed to facilitate their implementation.
Planning for disruption
While app development may initially be disruptive, the deployment of a mobile app should ultimately invoke a level of change for the better across the business. This only happens if the apps are well designed, the business is ready to adapt both procedurally and culturally, the initiative has support from senior management and the project factors in relevant education or training for the end users.
To commission and deploy enterprise apps effectively, companies should: